I’m just starting with Duplicati and I’m still a bit unsure which values to use for Remote Volume Size and block size. Can I keep the default settings for the following backups, or should I adjust them? My internet connection is stable.
All backups will go to OneDrive and Google Drive (if this matters).
- Backup 1: about 100 GB. Rarely changes, but very important files.
- Backup 2: about 1 TB. Changes by about 1 GB - 10 GB per month (or less).
- Do I need to take any other safety measures?
- Should I take any special precautions for the database?
- If I lose my backup PC, can I easily restore my cloud backup from a second PC? What do I need for this?
Sorry if these questions have been answered somewhere before; it all gets a bit confusing for me and I just want to be sure I’m doing everything right.
If you have two backup targets and a small but very important data set, it makes sense to run two separate backup jobs for it, so that you have two independent backups.
This is entirely for you to test! Don’t settle for vague assurances; check for yourself, because only you can confirm that your setup is correct. If you don’t test your first backup by completely wiping your backup setup and then trying to restore some data, you are living dangerously.
What I do is keep several separate backups of the configuration (exported as JSON), without the passwords saved in them. In another separate set of backups, I save all credentials and the all-important encryption key. There are sometimes ways to recover lost credentials for cloud accounts, but nobody will give you back a lost encryption key.
So make all of these separate backups (the encryption key can be saved on old-fashioned paper, for example), wipe your backup setup, recreate it, re-enter all credentials without reusing anything from the backup system itself, rebuild the database from the cloud backup(s), and try to restore. This procedure is not specific to Duplicati; any backup system should be verified this way before you let it run on its own.
In a ‘serious’ context, this verification should be done on a regular basis.
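The “restore and compare” part of this drill is easy to script. Here is a minimal Python sketch that checks a restored sample against the originals by SHA-256 hash; the directory paths are placeholders you would point at your own source and restore-target folders:

```python
# Compare a restored sample against the original files by hash.
# Any relative path that is missing or differs is reported.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file incrementally so large files don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_sample(original_dir: Path, restored_dir: Path) -> list[str]:
    """Return relative paths whose restored copy is missing or differs."""
    mismatches = []
    for orig in original_dir.rglob("*"):
        if orig.is_file():
            rel = orig.relative_to(original_dir)
            restored = restored_dir / rel
            if not restored.is_file() or sha256_of(orig) != sha256_of(restored):
                mismatches.append(str(rel))
    return mismatches
```

An empty result means the sampled restore matched the originals byte for byte; anything else names the files you need to investigate.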
You can leave backup job 1 at the default settings. Backup job 2 is a bit trickier, as the volume of data is rather large. As a general rule I would raise the block size to 500KB and the remote volume size to 150MB; larger blocks keep the total block count, and with it the local database, at a manageable size for 1 TB of data. Now to your questions:
- Always take safety measures: regularly check on your backups, configure email reports and actually read them, and periodically restore a sample of the data to make sure the backup jobs are consistent and that you’re backing up the right data.
- The database can always be recreated from the remote files, but your 1 TB backup is rather large and that recreate may take a while. See below.
- Duplicati plus a backup of your ‘backup jobs’ (the exported configurations) is all that’s needed for a DR restore. Simply import your jobs into a new Duplicati install, walk through the wizard to disable the schedule, recreate the database, and run the restore. There have been some reports of the database recreate taking a long time, so if that’s a concern, you could set up a third job that backs up the local databases of your two backup jobs.
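As a sketch of that third-job idea: Duplicati keeps one SQLite file per backup job in its data folder, so “backing up the databases” can be as simple as copying those files somewhere safe while no job is running. The data-folder path below is an assumption (the Linux default; check About → System info in your own install), and copying mid-backup can produce an inconsistent snapshot, so schedule this when the jobs are idle:

```python
# Copy Duplicati's local job databases (SQLite files) to a safe location.
# ASSUMPTION: the data folder layout (one *.sqlite file per job) -- verify
# the actual path in your installation before relying on this.
import shutil
from pathlib import Path

def snapshot_databases(data_dir: Path, dest_dir: Path) -> list[Path]:
    """Copy every *.sqlite file from the Duplicati data folder to dest_dir."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for db in data_dir.glob("*.sqlite"):
        target = dest_dir / db.name
        shutil.copy2(db, target)  # copy2 preserves timestamps
        copied.append(target)
    return copied

# Example (Linux default path; Windows uses %LOCALAPPDATA%\Duplicati):
# snapshot_databases(Path.home() / ".config" / "Duplicati",
#                    Path("/backup/duplicati-dbs"))
```

With a recent database copy on hand, a DR restore can skip the lengthy recreate step entirely.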
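On the block-size advice above, a quick back-of-envelope calculation shows why the defaults get uncomfortable at 1 TB. This sketch assumes 1 TB = 10^12 bytes and reads the suggested sizes as KiB/MiB; Duplicati’s defaults are a 100KB block size and 50MB remote volumes:

```python
# Rough count of blocks (local database entries) and remote volumes
# for a 1 TB backup under different size settings. Real numbers vary
# with compression and deduplication; this is only an order-of-magnitude sketch.

def blocks(total_bytes: int, block_bytes: int) -> int:
    """Ceiling division: how many pieces of block_bytes cover total_bytes."""
    return -(-total_bytes // block_bytes)

TB = 10**12
KiB, MiB = 1024, 1024**2

print(blocks(1 * TB, 100 * KiB))   # default block size  -> ~9.8 million blocks
print(blocks(1 * TB, 500 * KiB))   # larger block size   -> ~2.0 million blocks
print(blocks(1 * TB, 50 * MiB))    # default volume size -> ~19,000 remote files
print(blocks(1 * TB, 150 * MiB))   # larger volume size  -> ~6,400 remote files
```

Fewer blocks mean a smaller, faster local database, and fewer remote volumes mean fewer files for the cloud provider to list and fetch, at the cost of slightly coarser deduplication.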
Great advice from everyone. The only thing I’ll add is that backing up to online storage that’s accessible from the computer as ordinary files will probably allow malware such as ransomware to destroy your backups too. Consider using storage that’s hard to reach as seemingly ordinary files, and you’ll be a bit safer.