I have an UnRaid server and have installed Duplicati to perform backup tasks. I have two backup configurations: Photographs and Documents. It’s the Photographs one I am discussing here. It backs up a directory of approximately 1TB to a USB external disk that is also attached to the server. The initial backup completed fine quite a while back, but recently when I start the backup job it just sits on "Photographs: Verifying backend data". I left it overnight last night and it made no difference; it was still verifying backend data this morning. Will it ever finish? What is it actually doing? Is it actually doing anything? Does it have to do this every time?
I’m wondering if this is happening because I don’t leave the server turned on all the time; I just switch it on as needed.
After each backup, Duplicati will download a few files from the destination to check the integrity of the backup. If you haven’t changed the default block size, then this should amount to around 50 MB being downloaded.
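If you want to run that same verification by hand, the TEST command can do it. A minimal sketch, assuming a local destination and that your install provides the duplicati-cli wrapper (the paths, passphrase, and database location below are placeholders, not taken from this thread):

duplicati-cli test file:///mnt/disks/usb_backup 1 --dbpath=/config/Photographs.sqlite --passphrase=<your-passphrase>

The trailing 1 is the number of samples; each sample means downloading one dlist, one dindex, and one dblock file, and with the default 50MB remote volume size that is where the roughly 50 MB comes from.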
What version of Duplicati are you running? Also, can I ask why you’re backing up to an external USB disk and not to a drive in the array? I’m also backing up to an UnRaid server and am curious about the pros and cons of different configurations.
I’ll have to check what the block size is set to as I may have changed it initially. I’ll also come back and post the version, but I did update Duplicati this morning. I’m using Duplicati to back up a directory on the array to an external USB disk connected to the UnRaid server, just as another place to have it backed up. Once I have this backup running reliably, I want to duplicate it as an offsite backup using Backblaze B2. That’s the eventual goal.
I’m also running Duplicati on unRAID (official Docker container) but am not backing up to USB (just an array disk and cloud storage).
Depending on what version of Duplicati you were using, you may have been bumping into a bug where the progress bar wasn’t being updated correctly even though the backup had progressed past the verification.
Note that while warwickmm is correct about the end-of-backup “can I download and verify a file” tests, I believe the issue you’re facing is the pre-backup “does the destination have all the files I think it should have” check.
If updating to a newer version of Duplicati doesn’t resolve the issue, consider testing with this parameter - if it resolves it, then we’ve isolated the issue and can move forward from there:
--no-backend-verification
If this flag is set, the local database is not compared to the remote filelist on startup. The intended usage for this option is to work correctly in cases where the filelisting is broken or unavailable.
Default value: “false”
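As a sketch of where that goes (the destination URL and source path here are placeholders): in the web UI you can add it under the job’s Options → Advanced options, or on the command line it would look something like:

duplicati-cli backup file:///mnt/disks/usb_backup /mnt/user/photos --no-backend-verification=true --passphrase=<your-passphrase>

Keep in mind this only skips the check as a diagnostic step; you wouldn’t want to leave it on permanently, since the pre-backup verification exists to catch a destination that has drifted out of sync with the local database.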
I don’t think it is this issue as the backup has been in progress since 8am this morning, and when I check in About it shows: "Phase":"Backup_PreBackupVerify".
I have enabled that flag too, but it is not making any difference. I’m running Duplicati - 2.0.3.3_beta_2018-04-02.
Unfortunately, that page does indeed seem to imply it’s still verifying but has no progress for some reason. (Of course I haven’t had much experience with 1TB backups, so this could be normal?)
Hopefully @Pectojin has a moment to take a look and give his opinion.
I’m a little confused by that last log. It’s deleting a dindex file, so perhaps there’s an inconsistency between the DB and the index? I’m honestly not sure if that’s something that the verification process deals with.
Backup_PreBackupVerify and Backup_PostBackupVerify both say Verifying backend data ..., but they’re different verifications. The Phase from lastPgEvent, combined with a "Fatal error" failure in the same minute as "Server has started", suggests this is the early test, which may delete remote volumes in odd states such as Temporary, Deleting, or Uploading. There are Information-level log messages around that point which explain the actions, so the next backup test should probably set --log-level=Information or higher, and either use --log-file or Commandline (the web version is fine) to get a better view. We may also catch a Delete sequence. An example of how to set that up:
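A minimal sketch of such a run (the destination URL, source path, and log path are placeholders, and the exact invocation depends on your install; in some setups it is mono Duplicati.CommandLine.exe rather than duplicati-cli):

duplicati-cli backup file:///mnt/disks/usb_backup /mnt/user/photos --log-file=/config/photographs.log --log-level=Information --passphrase=<your-passphrase>

With that in place, any remote volumes the pre-backup check removes should show up in the log as explicit delete operations.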
I don’t know how the final failure happens, but at least we can see if we can get a better idea of other events. Perhaps @rctneil can see if duplicati-i49f8c37a24dc4d1fa5cf9c138e93216a.dindex.zip is on the USB drive.
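For that check, something like the following from the unRAID console would do (the mount point is a placeholder for wherever the USB disk is actually mounted):

ls /mnt/disks/usb_backup | grep i49f8c37a24dc4d1fa5cf9c138e93216a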
Personally, I’d try again for a log view. I’m not sure why your last one had so little, but maybe it was timing, so please open a second browser tab for MENU → About → Show log → Live → Information, then start the backup from the first tab. If you’d rather not watch the browser, you can instead use the --log-file and --log-level options. In my case the log file seemed to ignore Information and went for the even-more-wordy Profiling, which is even better while chasing an issue, but you’ll probably want to turn it off afterwards (or at least trim the file occasionally).
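As a sketch, when set on the job rather than the command line, those two options go into the job’s advanced options like this (the log path is a placeholder):

--log-file=/config/photographs-profiling.log
--log-level=Profiling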
@ts678 Nothing seemed to be making any difference so I told it to delete and recreate the database. This seems to have made a difference and the current progress says:
Looks mostly reasonable, assuming the plan progressed from "delete and recreate the database" to a new backup. You can get a file count and size from your OS tools, but here the average source file size is 7MB (plausible for photos, for some cameras), while the average destination dblock file size is a surprising 2GB, which presumably was a manual setting but seems high. I’d worry that the loss of a single backup file would lose many photos.
Backup by default will do some sampled verification of the destination files after the backup runs, configurable by --backup-test-samples and with the meaning of a sample explained in The TEST command, so using a 2GB --dblock-size will probably mean a bit of a slowdown there even if the number of new photos uploaded is small.
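If you do want to bring that back down, a sketch of the relevant advanced options (the values here are illustrative; 50MB is simply Duplicati’s default remote volume size):

--dblock-size=50MB
--backup-test-samples=1

Note that, as far as I know, changing --dblock-size only affects volumes created after the change; the existing 2GB volumes keep their size until they are compacted or the backup is recreated.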