Duplicati Backup and Backing up Duplicati

After some help from this forum, and a lot of reading, experimentation and failures, I am now able to use Duplicati from a FreeNAS jail to back up my SMB-style data on the FreeNAS to another (non-FreeNAS) NAS on my LAN. Thank you for that.
I have six different jobs defined, ranging from 2GB to 10+TB, and I have made adjustments to things like blocksize and remote volume size on the larger jobs.

I have exported each job to a file, and stored that file safely away somewhere. I also snapshot the jail and copy the snapshot to an alternative pool just in case I lose the pool.

It does occur to me, however, that something is missing in Duplicati, needed only in the event of a catastrophic failure of the overall system that would require a complete rebuild. Assume I lose my FreeNAS server entirely (hopefully unlikely) and have to rebuild, perhaps on new hardware. I can re-install Duplicati in the jail, but I may or may not have access to the job details, and I do not want to have to rebuild databases that in some cases are apparently taking days to recreate.

Would it not be sensible to do the following:
With each Duplicati job, export all the job details, databases, etc. to the destination folder, encrypted with a suitable password (which I would have to remember), so that in the event of a rebuild I can point at the destination folder, quickly download everything I need for that backup job, and start the laborious process of a restore.
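A DIY version of this idea is possible today: after each backup, bundle the exported job file and the job's local database into a password-encrypted archive and drop it in (or next to) the destination folder. Below is a minimal sketch, assuming openssl is available in the jail; all paths, filenames, and the passphrase handling are placeholders, and for real use you would read the passphrase from a protected file rather than passing it on the command line.

```shell
#!/bin/sh
# Sketch: bundle a job's exported config and its local database into a
# single password-encrypted archive that can live next to the backup
# data itself. Paths are placeholders; assumes openssl is installed.

bundle_job_state() {
    # $1 = exported job config, $2 = job database,
    # $3 = destination directory, $4 = passphrase
    cfg="$1"; db="$2"; dest="$3"; pass="$4"
    mkdir -p "$dest"
    out="$dest/jobstate-$(date +%Y%m%d).tar.gz.enc"
    tar -czf - -C "$(dirname "$cfg")" "$(basename "$cfg")" \
               -C "$(dirname "$db")"  "$(basename "$db")" |
        openssl enc -aes-256-cbc -pbkdf2 -salt -pass "pass:$pass" -out "$out"
}

# Example (placeholder paths):
# bundle_job_state /root/job-export.json /config/JOB.sqlite /mnt/dest secret

# To recover after a rebuild (DATE is the archive's date stamp):
# openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:secret \
#     -in jobstate-DATE.tar.gz.enc | tar -xzf -
```

This only covers one job; a real setup would loop over the six jobs, and the archive is only as safe as the passphrase you can remember after a disaster.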

Archiware Pure (VM backups) does something like this: the config is stored with the backups. If I have to rebuild the server, I just point the new install at the destination and everything is golden.

The ability to export the Duplicati config and databases on a schedule, with no interaction, to a specified location would also help with the disaster recovery process.

Apologies if this already exists; some of the material I am reading is quite old.

For a while I used a post-backup script that copied the job database to an alternate location, but I stopped doing it when improvements were made and bugs were fixed in the database recreation process. You can test the recreate process yourself to see how quickly the database is rebuilt; you may find that separate backups of it are unnecessary. If you DO decide you want to back up the database, it’s important to do it after every backup job.
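A post-backup copy script of the kind described above can be very small. Here is a sketch; the paths and retention count are placeholders, and hooking it up via Duplicati's run-script-after option is an assumption about your setup.

```shell
#!/bin/sh
# Sketch of a post-backup script that copies a Duplicati job database
# to an alternate location, keeping a handful of dated copies.
# Paths below are placeholders.

backup_job_db() {
    # $1 = path to the job's sqlite database
    # $2 = directory to keep copies in
    # $3 = number of dated copies to retain
    db="$1"; dest="$2"; keep="$3"

    mkdir -p "$dest"
    stamp=$(date +%Y%m%d-%H%M%S)
    cp "$db" "$dest/$(basename "$db").$stamp"

    # Prune the oldest copies beyond the retention count.
    ls -1t "$dest"/"$(basename "$db")".* | tail -n +$((keep + 1)) |
        while read -r old; do rm -f "$old"; done
}

# Example (placeholder paths):
# backup_job_db /config/ABCDEFGHIJ.sqlite /mnt/dbcopies 3
```

Run after every backup, as noted above, so the copy never lags behind the remote data.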

Keeping exports of the job configurations is very helpful; resave the export any time you reconfigure the backup job. You can also consider backing up Duplicati-server.sqlite. It contains all job definitions and global settings. (Global settings are not saved when you export a job config.)
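Backing up Duplicati-server.sqlite can be scheduled with a small script run from cron in the jail. A sketch follows; the location of Duplicati's configuration directory varies by install, so the paths here are placeholders.

```shell
#!/bin/sh
# Sketch: archive the Duplicati configuration directory (which holds
# Duplicati-server.sqlite and the per-job databases) to a safe
# location. The config-dir path varies by install; placeholders below.

backup_duplicati_config() {
    # $1 = Duplicati config directory, $2 = destination directory
    cfg="$1"; dest="$2"
    mkdir -p "$dest"
    stamp=$(date +%Y%m%d)
    tar -czf "$dest/duplicati-config-$stamp.tar.gz" \
        -C "$(dirname "$cfg")" "$(basename "$cfg")"
}

# Example (placeholder paths):
# backup_duplicati_config /root/.config/Duplicati /mnt/configbackups
```

Scheduled nightly (outside the backup window, so no job is mid-write), this gives you the job definitions and global settings in one dated archive.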

As I am running in a jail, I think I will just snapshot the iocage data pool after a backup sequence completes (estimated) and then replicate that snapshot elsewhere. I think that should do it.
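That snapshot-and-replicate step can itself be scripted. A minimal sketch is below; the dataset names are placeholders, the full (non-incremental) send is a simplification, and the ZFS variable lets you dry-run by setting ZFS="echo zfs" to print the commands instead of running them.

```shell
#!/bin/sh
# Sketch: snapshot the iocage dataset after a backup run and replicate
# the snapshot to another pool. Dataset names are placeholders; set
# ZFS="echo zfs" to dry-run and just print the commands.

: "${ZFS:=zfs}"

snapshot_and_replicate() {
    # $1 = source dataset, $2 = destination dataset
    src="$1"; dst="$2"
    snap="$src@duplicati-$(date +%Y%m%d-%H%M%S)"
    $ZFS snapshot -r "$snap"
    # A full send each time; incremental (-i) sends are a refinement.
    $ZFS send -R "$snap" | $ZFS recv -F "$dst"
}

# Example (placeholder datasets):
# snapshot_and_replicate tank/iocage backup/iocage-copy
```

The "(estimated)" timing above is the weak point: running this from the same post-backup hook as the jobs themselves would remove the guesswork.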