I’ve been using Duplicati for 2-3 years now. My working data set on my laptop is large (1.5 TB) and I back it up daily to four destinations: 2 x local folders (on external drives) and 2 x self-managed S3 servers running Minio in different off-site locations.
Over the years, Duplicati has repeatedly corrupted its local database. I always had the impression, though I could never confirm it, that this happened when a backup task was interrupted (unexpected shutdown, system crash, or forced shutdown).
However, on the latest beta (2021-05-03) I have found that it corrupts the database every single time it attempts a backup while the destination is unavailable: for an external drive that means unmounted, and for the S3 servers it means the computer is offline. The corruption quickly became a big problem because the repair would fail and the database would have to be recreated. On a number of occasions the recreation would also fail, and I had no option but to nuke both the backup set and the database and start over. None of this is really news; the bug reports are full of similar examples.
The reason for my post is to provide some practical advice for how to overcome this issue. What I have discovered recently is that Duplicati already has a feature that can mitigate it, and it’s an elegant, neat solution: use the “run-script-before-required” option and point it at a script that checks for the presence of the destination. If the script exits with a non-zero code, Duplicati aborts the run instead of attempting a backup against a missing destination. I have done this on Linux Mint, so the examples below may not work for everyone, but the concept is sound and I highly recommend it to any Duplicati user who has experienced similar issues. You can easily write similar scripts for Windows or other clients.
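For reference, wiring the script up from the command line looks roughly like this. The option name is Duplicati’s actual --run-script-before-required; the source path, destination URL, and script path are placeholders you would substitute with your own:

```shell
# Hypothetical invocation -- adjust source, destination URL, and script path.
# If check-destination.sh exits non-zero, the backup run is aborted.
duplicati-cli backup \
    "file:///mnt/externaldrive/duplicati-backups" \
    /home/user/data \
    --run-script-before-required=/home/user/scripts/check-destination.sh
```

If you use the web UI instead, the same option can be set under the job’s advanced options.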
I have been running like this for over a month without a single corrupted database, where previously corruption would happen 2-3 times per week. So I’m very happy now and hope this will bring some joy to other users. This basic safeguard really ought to be added to the core routine, but I’ve no doubt the development team are doing the best they can and will get to it when it is possible to do so.
Here are the scripts I’ve been using:
Script to check for the presence of an external drive. One small but important detail: I use a sub-folder on the external drive to hold the backup files and point the check at that sub-folder, so the path check only returns true when the drive is actually mounted. If you check the mount point itself, it will not work as expected, since the mount point directory typically exists before the drive is mounted:
```shell
#!/bin/bash
# Does the target path exist?
[ ! -d "/path/to/target/folder" ] && exit 1
# All good
exit 0
```
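To see the exit-code convention in action without plugging in a drive, here is a small self-contained demo. `check_dir` is a hypothetical helper that mirrors the script’s test, and the path is a throwaway temp directory rather than a real mount:

```shell
#!/bin/bash
# Demo of the exit-code convention Duplicati expects:
# 0 = destination present (back up), non-zero = destination missing (skip).
check_dir() {
    [ -d "$1" ]   # succeeds only if the directory exists
}

demo=$(mktemp -d)                 # stands in for the sub-folder on the drive
check_dir "$demo"         && echo "would back up"
check_dir "$demo/missing" || echo "would skip"
rmdir "$demo"
```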
Script to check for the presence of a remote server:
```shell
#!/bin/bash
# Can we ping the device? Try twice :)
ping -c 2 targethost.com > /dev/null && exit 0
# No dice!
exit 1
```
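One caveat with ping: it proves the host answers ICMP, but not that MinIO itself is running. A slightly stricter sketch probes the S3 port instead, using bash’s built-in /dev/tcp redirection. The host name is carried over from the ping example, and port 9000 (MinIO’s default) is an assumption you would adjust to your setup:

```shell
#!/bin/bash
# Probe a TCP port rather than relying on ICMP. port_open succeeds if
# something is listening, and fails on connection refused or a 5 s timeout.
port_open() {
    timeout 5 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Hypothetical endpoint -- substitute your own MinIO host and port.
if port_open "targethost.com" 9000; then
    echo "S3 endpoint reachable"
else
    echo "S3 endpoint unreachable"
fi
```

In the real destination-check script you would replace the echoes with `exit 0` and `exit 1`, matching the ping example above.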