I’m approaching Duplicati for the first time and I’m really loving it. I just have a question I can’t find an answer to: given that I don’t change any settings for a specific backup job, can I consider a job export JSON file valid forever? I took two exports a few days apart, and the files are actually different: “LastRun” and all the fields in “Metadata” changed. Is this difference important, or can Duplicati recover this information? If my backup server dies and I configure a new one by importing a config file from 6 months ago, will everything align smoothly? Or should I export the configuration after each job run?
No, it will not prevent you from restoring data.
The only small risk is that the current Duplicati version is not compatible with the old export (because of a bug or something similar). This has happened recently, when Duplicati deprecated an API that a few people were still using, I think.
So you should also save a copy of each Duplicati version you install. That means managing updates yourself, rather than clicking update every time Duplicati offers you a new version.
Yes, you can download old versions, but IMO it’s better to keep a copy in a safe place; it saves you time when you want to do a restore.
Thank you! Yeah, backing up a fresh configuration on an upgrade is not a problem; I’m usually very careful when I do upgrades. My biggest fear was needing a daily config export. I was pretty sure it was unnecessary, but I preferred to ask.
To clarify, @gpatel-fr means that you keep a copy of the Duplicati setup file saved alongside your configuration file, so that the two match.
I do this as a matter of course, in case something happens and I can’t get the setup files (a fail-safe, in case something happens to the developers of the various backup software that I use)… and now there is another ‘bonus’ reason that I hadn’t considered!
So, in my “disaster recovery” storage I have the Duplicati setup file: duplicati-184.108.40.206_beta_2021-06-17-x64.msi
and a configuration export of each job.
Every time there is an update to Duplicati, I re-download the msi and update it on my disaster recovery storage and make new configuration exports.
Every time I make a change to a job, I make a new configuration export and re-upload it to my disaster recovery storage.
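That routine — copy the installer plus every job export into the disaster recovery location — is easy to script. A minimal sketch, assuming hypothetical paths (adjust them to your own layout; the function names are mine, not Duplicati’s):

```python
import shutil
from pathlib import Path

def sync_dr(installer, export_dir, dr_dir):
    """Copy the Duplicati setup file and every job export (.json)
    into the disaster-recovery location, creating it if needed."""
    dr = Path(dr_dir)
    dr.mkdir(parents=True, exist_ok=True)
    shutil.copy2(installer, dr)          # the installer version you actually run
    for export in Path(export_dir).glob("*.json"):
        shutil.copy2(export, dr)         # one exported config per job

# Example with made-up paths:
# sync_dr("~/Downloads/duplicati-2.x.y.msi", "~/duplicati-exports", "/mnt/dr/duplicati")
```

Run it after each upgrade or job change, or from a scheduled task, so the installer and exports in the DR store never drift apart.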
I’ve even gone so far as to keep a copy-and-paste text note of the default options under Settings (these might be important, depending on what you have in there).
In terms of what disaster recovery storage to use, I suggest it be independent of the storage you are using for Duplicati, whilst still being offsite and easily accessible in a disaster recovery scenario. So you could upload these files to Dropbox, OneDrive or Google Drive.
If you are encrypting the configuration exports, I suggest storing the encryption passphrase in a password manager that is synced to the cloud, so it is readily available in a disaster recovery scenario. Consider also putting any other login and backup storage credentials you might need into the password manager, if you are happy with that.
If you are using Duplicati to put backups on the cloud, then by taking care of its configuration files in this way you will, in my opinion, be well protected against a fire/flood scenario.
Informational data in the export (such as last run) is not at the backup destination, so it can’t be restored from there, but it isn’t essential either. It would be more relevant if you were doing something like an export meant to be put immediately back into action via an import on a different Duplicati installation. You have to ask for it explicitly when importing.
If you don’t ask to import your metadata, your home screen can’t show it until a backup has been run. Importing a stale export for disaster recovery will show stale statistics for a while, but that’s not a big deal.
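If you want to confirm that two exports differ only in this volatile bookkeeping (and not in any real setting), you can strip those fields before comparing. A small sketch — the field names `LastRun` and `Metadata` come from the question above; check them against your own export files, since I haven’t verified the full export schema:

```python
import json

# Bookkeeping fields assumed volatile; verify against your own exports.
VOLATILE = {"LastRun", "Metadata"}

def strip_volatile(obj):
    """Recursively drop bookkeeping fields so only real settings remain."""
    if isinstance(obj, dict):
        return {k: strip_volatile(v) for k, v in obj.items() if k not in VOLATILE}
    if isinstance(obj, list):
        return [strip_volatile(v) for v in obj]
    return obj

def exports_equivalent(path_a, path_b):
    """True if the two export files agree on everything except volatile fields."""
    with open(path_a) as a, open(path_b) as b:
        return strip_volatile(json.load(a)) == strip_volatile(json.load(b))
```

If this returns True for exports taken months apart, the older one is still a faithful copy of the job’s actual configuration.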
This good idea could be used as a second level of safety, to store the essentials you’d need to run “Direct restore from backup files”. Basically, make sure you have the core things that you might otherwise forget…
The most essential thing is probably your data, so you need the destination and encryption information.
Configuration (like which folders you backed up) can be entered again by hand, although a config import is nicer.
As always, occasionally testing restores (and your restore procedure) is important to make sure all stays well. Duplicati’s use of a local database complicates disaster recovery because the database must be recreated. This is never instant, and it can get slower if the recreate process finds a problem (e.g. missing data).
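One simple way to check a test restore is to compare the restored tree against the source, byte for byte. A minimal sketch (the directory layout and function names are illustrative, not part of Duplicati):

```python
import hashlib
from pathlib import Path

def tree_digests(root):
    """Map each file's path (relative to root) to its SHA-256 digest."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def restore_matches(source_dir, restored_dir):
    """True if the restored tree is byte-identical to the source tree."""
    return tree_digests(source_dir) == tree_digests(restored_dir)
```

This only verifies content, of course; it says nothing about whether the restore was acceptably fast, which is exactly what the database recreate step can affect.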