How to test / verify your backups

This is a summary of some basic methods for testing your backup files without doing a full restore. It was written based on a canary version but should work just fine going back to at least the beta.

This post is a wiki - if you see something incorrect, outdated, or just plain missing, feel free to fix or add it yourself by clicking on the edit button at the bottom right of this post! :smiley:

Why would I want to test my backup files?

By default, Duplicati already tests 1 “random” set of backup files (a “fileset” = 1 dindex + 1 dlist + 1 dblock) after each backup run. However, a single backup run can generate more than 1 set of files, meaning that at 1 test per run Duplicati may never get around to testing all of them.

Alternatively, maybe you had a “scare” with one of your drives (bad S.M.A.R.T messages or a dropped USB drive) and you want to double check that everything is OK.

Or perhaps you have moved your destination files from one provider to another (maybe even as part of a backup seed) and want to confirm everything got copied around OK.

Why might I NOT want to test my backup files?

Running test or verify will download files from your destination - potentially ALL of them. Depending on your connection this can take a while, use up bandwidth (perhaps hitting usage caps), and slow down other things on your network (though you can use the --throttle-* options to minimize that issue).
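
For example, to cap the download speed of the test you could add something like this (the actual value is up to you and your connection):

```
--throttle-download=2MB
```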

Also, while this is a pretty good method of checking your backups, nothing replaces a good old fashioned FULL RESTORE. :wink:

How to run a GUI based test / verify

  1. Click “Commandline …” in the job GUI

  2. Select the test command (verify will also work - they do the same thing)

  3. Replace all existing “Commandline arguments” with all (or the specific number of filesets you want to test)

  4. Optionally add additional parameters (either on their own lines in “Commandline arguments” or using the “Advanced options” interface) such as:
    a. --full-remote-verification=true (download, decrypt, compare to the database, then test-extract & hash check a random 30% of the contents of each dblock file - otherwise only a hash check of the archive file itself is done, not of the individual content files)
    b. --console-log-level=XXXX (show additional info in the console at level XXXX)
    c. --console-log-filter=YYYY (filter console results to those of type YYYY)

    Note that there is no need to remove any pre-existing “Advanced parameters” as some might actually be needed (such as --passphrase and --dbpath). Also, parameters you add as part of a Commandline run are NOT saved, so if you are planning to do multiple Commandline runs you will have to add them each time.

  5. Click the blue Run "test" command now (or Run "verify" command now) button at the bottom of the page

Note that the --console-log-level and --console-log-filter parameters are only available in newer versions. If using an older version, either don’t use those parameters or replace them with --log-file=[path] and --log-level=XXXX, which will put the info into a log file (but not on the console, sorry).
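
For example, the log-file fallback might look like this in “Commandline arguments” (the path here is just a placeholder - use whatever makes sense on your system):

```
all
--log-file=C:\Duplicati\test-run.log
--log-level=information
```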

Here's a working copy/pastable example for 'Commandline arguments'
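
Something like the following should work (the log level shown is just my preference - adjust to taste):

```
all
--full-remote-verification=true
--console-log-level=information
--console-log-filter=-*RemoteOperationsGet*;
```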

Remove the -*RemoteOperationsGet*; text if you want to see the names of EVERY file downloaded whether or not problems are found with them.

How to run a CLI based test / verify

Run your Duplicati.CommandLine.exe test command with the parameters mentioned above. For example:

Duplicati.CommandLine.exe test "[path to my destination]" all --dbpath="[path to my sqlite DB]" --console-log-level=profiling --console-log-filter=-*.Database.*

Geeky notes

Running test (or verify) will download one fileset (1 each of dindex, dlist, and dblock files) at a time to your temp folder (you should see dup-* files coming & going in there), then test them, then delete them - so you shouldn’t need more temp storage than a little more than your “Upload volume size” (dblock) size.

HOWEVER - eventually ALL files will have been downloaded from the destination, so be sure to keep that in mind if you have usage caps (see below info about “random”). There should be no UPLOAD bandwidth usage as part of this process.

For the curious ones out there, Duplicati keeps track of how many times a file has been tested so when “randomly” choosing the next file to test it will only pull from files with the least number of tests logged against them. This means that running multiple partial tests (such as test 100 instead of test all) will make sure all files are tested at least once before re-testing files.
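
The “least tested first” selection can be sketched in a few lines (a toy model of the idea described above, NOT Duplicati’s actual code):

```python
import random

def pick_least_tested(test_counts, n):
    """Toy model: test_counts maps remote file name -> times tested so far.
    Returns n file names, always drawing from the least-tested files first."""
    counts = dict(test_counts)  # don't mutate the caller's dict
    chosen = []
    for _ in range(n):
        fewest = min(counts.values())
        candidates = [name for name, c in counts.items() if c == fewest]
        pick = random.choice(candidates)
        counts[pick] += 1  # log a test against this file
        chosen.append(pick)
    return chosen

# Files never tested are always picked before a file already tested 5 times:
print(sorted(pick_least_tested({"a.dblock": 0, "b.dblock": 0, "c.dblock": 5}, 2)))
# → ['a.dblock', 'b.dblock']
```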

This can be handy if you have usage caps to manage as it means you can run partial tests over multiple time periods without worrying about “over-testing” some files while ignoring others.

Personally, I like detailed logs WITHOUT the database calls so I use:
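
Namely, the same level and filter as in the CLI example above:

```
--console-log-level=profiling
--console-log-filter=-*.Database.*
```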


The only problem, at least with my version, is that running with these options gives the following warnings:

The supplied option --console-log-level is not supported and will be ignored
The supplied option --console-log-filter is not supported and will be ignored

Have these options been deprecated, and if so, what replacement is available? If they are supposed to be valid, what am I doing wrong? (In the GUI I click Commandline, set the command to test, erase my backup path from the commandline box, and paste in your content.)


Thanks for reminding me. I’ll update the post later, but for now: you’ve bumped into the logging change between versions.

Basically, the “console-log-*” (and some other advanced logging) parameters weren’t added until a later version.

For now you’ll just have to ignore the warning (or remove the parameters). :frowning:

Hi, is there an option to run this without local DB? Or is this always required?

I am trying to run this command:

duplicati-cli test ssh://localhost:22222/backups/documents 1 --auth-password=***** --auth-username=dup-**** --ssh-key=sshkey%3A%2F%2F-***---- --ssh-fingerprint=ssh-rsa***** --no-local-db=true --passphrase=**** --verbose=true --full-remote-verification=true

But this gives the error:

Database file does not exist: /root/.config/Duplicati/RCBJZTRKXP.sqlite

Why is it trying to create a database? Can I not test without a local database?

The reason I am asking is because I want to do verification serverside, not client side. But the server is not the one making those duplicati backups.


Part of the verification that Duplicati does checks things like file size and hashes against the local database.

I’m not sure if Duplicati can do a full verify without the database, but there are some other standalone tools / scripts that can do things like CRC and decompression verifications.
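
If I recall correctly, one option is to have the backup job upload a verification file (--upload-verification-file=true), which the DuplicatiVerify script shipped in Duplicati’s utility-scripts folder can then check server-side without the local database - something like this (paths are placeholders):

```
# Assumes the backup job ran with --upload-verification-file=true so that a
# duplicati-verification.json file exists alongside the backup files.
python DuplicatiVerify.py /backups/documents
```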

Perhaps this will help:

Ok then, if I go by that thread, the only real solution in my remote case would be to either let it rebuild the database or do a full restore.

The documentation is lacking a bit in this regard. How do I do a commandline restore of the latest version of all files of a backup?

So, replying to myself here. This is the script I finally settled upon (anonymized a bit):


rm -rf /tank02/ds02/temp/duplicatiRestoreTest/dup-name/
mkdir -p /tank02/ds02/temp/duplicatiRestoreTest/dup-name/

duplicati-cli restore ssh://localhost:22222/backups/documents --auth-password=Password --auth-username=dup-name --ssh-key="sshkey://-----BEGIN%20RSA%20PRIVATE%20KEY---SshPrivateKeyValue--END%20RSA%20PRIVATE%20KEY-----" --ssh-fingerprint="ssh-rsa 2048 11:**:55" --no-local-db=true --passphrase=PassPhrase --restore-path=/tank02/ds02/temp/duplicatiRestoreTest/dup-name/ --send-mail-from=** --send-mail-to=** --send-mail-url=smtps:// --send-mail-username=** --send-mail-password=MailPassword --send-mail-any-operation=true --send-mail-subject="Duplicati Backup report for name-verify"

rm -rf /tank02/ds02/temp/duplicatiRestoreTest/dup-name/

The report subject name was tinkered a bit so dupReport picks it up and keeps me posted of backups to the “verify” location.


That looks like a nice solution if you’ve got the destination space for a test restore, thanks for sharing the script!