Verifying backend data forever

I failed to save the database when reinstalling the OS. Because of this I also had issues with getting stuck on Verifying backend data. After deleting and recreating the database as described above, the backup started working again.

These are the steps I took and the issues I had.

  1. Added the backup manually and pointed it to the same source and backup location (SFTP).

  2. Ran the backup. Got a warning and a proposal to repair the database, which I did.

  3. After repairing the database I ran the backup again, and it completed after going through all 180 GB of files. The log stated that the backup completed successfully.

  4. Reran the backup to check that everything worked and that it would now complete quickly. But the backup got stuck on “Verifying Backend”. I waited for more than 24 hours, stopped, restarted, waited another 24+ hours, stopped, and restarted. The live log message it got stuck on was (see the query-plan sketch after this list):

Jan 26, 2019 12:35 PM: Starting - ExecuteReader: SELECT "A"."Hash", "C"."Hash" FROM (SELECT "BlocklistHash"."BlocksetID", "Block"."Hash", * FROM "BlocklistHash","Block" WHERE "BlocklistHash"."Hash" = "Block"."Hash" AND "Block"."VolumeID" = 5525) A, "BlocksetEntry" B, "Block" C WHERE "B"."BlocksetID" = "A"."BlocksetID" AND "B"."Index" >= ("A"."Index" * 3200) AND "B"."Index" < (("A"."Index" + 1) * 3200) AND "C"."ID" = "B"."BlockID" ORDER BY "A"."BlocksetID", "B"."Index"

  5. Tried a number of things (I do not remember exactly what).

  6. Deleted and recreated the database (after finding this post).

  7. Ran the backup; it completed in about 15 hours and did not get stuck on Verifying backend. The time was spent copying or checking files.

  8. Reran the backup 3 days later. It completed in less than 1 hour.
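
In case it helps anyone debugging the same hang: the statement in the live log above is plain SQLite, so you can check what the query planner would do with it against a copy of your local database without waiting days for it to run. Below is a rough sketch using Python’s built-in sqlite3 module; the database path is only an example, and the literal values (VolumeID 5525, factor 3200) are simply the ones from my log, so substitute your own.

```python
import sqlite3

# Path to a COPY of the local Duplicati database (example path -- adjust to yours).
DB_PATH = r"C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite"

# The statement from the live log above, with the literal values
# (VolumeID = 5525, blocklist factor 3200) turned into parameters.
QUERY = """
SELECT "A"."Hash", "C"."Hash"
FROM (SELECT "BlocklistHash"."BlocksetID", "Block"."Hash", *
      FROM "BlocklistHash", "Block"
      WHERE "BlocklistHash"."Hash" = "Block"."Hash"
        AND "Block"."VolumeID" = ?) A,
     "BlocksetEntry" B, "Block" C
WHERE "B"."BlocksetID" = "A"."BlocksetID"
  AND "B"."Index" >= ("A"."Index" * ?)
  AND "B"."Index" < (("A"."Index" + 1) * ?)
  AND "C"."ID" = "B"."BlockID"
ORDER BY "A"."BlocksetID", "B"."Index"
"""

with sqlite3.connect(DB_PATH) as conn:
    # EXPLAIN QUERY PLAN reports which indexes (if any) SQLite would use,
    # without actually executing the potentially multi-day query.
    for row in conn.execute("EXPLAIN QUERY PLAN " + QUERY, (5525, 3200, 3200)):
        print(row)
```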

Thanks for sharing your experience. We know the database side can get quite slow, and while we’re working to improve it, progress isn’t happening very quickly.

We try to warn people it can take a long time, but “long time” means different things to different people. :slight_smile:

I tested a forced recreate on a 5+ year old laptop and aborted it after 3 weeks (I had to reboot for a software install). Since it was a test, I simply restored the backup copy I had made and everything was fine.

I documented a smaller test of recreating on differently powered machines from the same destination here, if you’re curious:

Note that while I did not test it, I suspect a recreate can be done anywhere and the resulting database copied to the ultimate Duplicati install machine.

This means that, in theory, one could make a standalone database recreate script to run at the destination (assuming your destination can run things) and then copy the recreated database to where it’s needed.
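
As a very rough, untested illustration of that idea: as I understand it, the stock repair command rebuilds the local database from the dlist/dindex files when pointed at a dbpath that does not exist yet, so something like the sketch below could run on or near the destination and the resulting file be copied over afterwards. All paths, the storage URL and the passphrase are placeholders; `duplicati-cli` is the Linux wrapper, on Windows it would be Duplicati.CommandLine.exe.

```python
import shutil
import subprocess

# Placeholders -- substitute the values from your own backup job.
STORAGE_URL = "ssh://backuphost//backups/myjob"           # same destination URL the job uses
SCRATCH_DB = "/tmp/recreated.sqlite"                      # rebuilt here; must not exist yet
FINAL_DB = "/path/to/duplicati-config/XXXXXXXXXX.sqlite"  # where the real install keeps it

# "repair" with a --dbpath that points at a missing file makes Duplicati
# rebuild the local database from the destination's dlist/dindex files,
# rather than repairing the destination from an existing database.
subprocess.run(
    [
        "duplicati-cli", "repair", STORAGE_URL,
        "--dbpath=" + SCRATCH_DB,
        "--passphrase=REPLACE_ME",
    ],
    check=True,
)

# Copy the rebuilt database to the machine/path the normal install expects.
shutil.copy2(SCRATCH_DB, FINAL_DB)
```

The copy step obviously only makes sense while no backup is running on the real install.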

Hello,

I have a question about the “Verifying backend data” thing:

I am using Duplicati on my local Win10 PC to make regular backups to an external USB drive, which currently has a size of 1.15 TB, and it has worked very well so far. Unfortunately the PC froze yesterday while it was doing the backup, so I had to shut down the computer. After that I restarted the backup and noticed that it took a long time for verifying backend data, so I interrupted it by killing the Duplicati.GUI.TrayIcon.exe process in Task Manager, because it didn’t stop when I tried to interrupt it in the browser. Next I tried “Verify files” under “Advanced” and also “Repair” under “Database”. When I started the backup again it went into the “Verifying backend data” mode after a minute again, so I left it overnight, and after several hours the status was still the same.

I have read in the forum that some users had to wait a long time for this process to finish, but what makes me doubtful is that when I look at Task Manager while Duplicati is in the “Verifying backend data” mode, both the destination and source hard disks as well as the OS system disk are continuously idle. Only the CPU is constantly working at around 30%.

So is it worth letting Duplicati work for a day or longer while only the CPU is busy? Or is it possible that the verification process got stuck?

The version before was 2.0.2.1; I have now tried updating to 2.0.4.5 with no results. I also tried the no-backend-verification = true option.

Thank you in advance for your help.

Best regards,

Christian

I’d recommend checking the About -> System info page and scrolling down to the bottom section where you should see a “lastPgEvent” block. Watch that for changes to see if Duplicati is still working on things.

Did you find a solution to your problem? I’m experiencing your exact same issue, and I’m out of options on what to do next.

Edit: My job did not actually finish in 4 hours; it is still not done after 6 days… So be patient or start over.

Me too. I’ve got one (albeit very slow) server that’s been at this for over a week and shows no sign of ever ending. The only good thing is that it’s still going and not simply hung.

So for some reason my backup job suddenly takes 3.5 hours to complete, instead of the regular 30 minutes it used to take. It’s good that it does complete, but for my use, 3.5 hours is impractically long.

Yes, after the crash I ran the backup and it took several hours. I also checked the About -> System info “lastPgEvent” block, which was always updating. After that the backup worked as usual.

So if you have a crash during a backup, you must be patient for the next try.

Several hours is fine, but over a week isn’t. I thought it was related to it being a Wasabi storage backup, but another server has decided to do the same, and that one is a local SMB storage backup. It also hasn’t been failing, so why it’s decided to do this I have no clue.

lastPgEvent : {"BackupID":"3","TaskID":5,"BackendAction":"Put","BackendPath":"ParallelUpload","BackendFileSize":943945,"BackendFileProgress":943945,"BackendSpeed":9,"BackendIsBlocking":false,"CurrentFilename":null,"CurrentFilesize":0,"CurrentFileoffset":0,"Phase":"Backup_PreBackupVerify","OverallProgress":0,"ProcessedFileCount":0,"ProcessedFileSize":0,"TotalFileCount":118729,"TotalFileSize":825545822534,"StillCounting":false}
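
In case it’s useful to anyone else watching that value: it’s plain JSON, so a small throwaway script (nothing official, just a convenience sketch) can pull out the fields that matter, such as Phase, which here shows Backup_PreBackupVerify and appears to correspond to the “Verifying backend data” status discussed in this thread.

```python
import json

# lastPgEvent value pasted from About -> System info (the example from above).
raw = '''{"BackupID":"3","TaskID":5,"BackendAction":"Put","BackendPath":"ParallelUpload","BackendFileSize":943945,"BackendFileProgress":943945,"BackendSpeed":9,"BackendIsBlocking":false,"CurrentFilename":null,"CurrentFilesize":0,"CurrentFileoffset":0,"Phase":"Backup_PreBackupVerify","OverallProgress":0,"ProcessedFileCount":0,"ProcessedFileSize":0,"TotalFileCount":118729,"TotalFileSize":825545822534,"StillCounting":false}'''

event = json.loads(raw)

# Phase is the current stage of the job; Backup_PreBackupVerify appears to be
# the "Verifying backend data" step this thread is about.
print("Phase:         ", event["Phase"])
print("Backend action:", event["BackendAction"], "->", event["BackendPath"])
print("Backend file:  ", event["BackendFileProgress"], "of", event["BackendFileSize"], "bytes")
print("Source files:  ", event["TotalFileCount"], "files,", event["TotalFileSize"], "bytes total")
```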

I was forced to reboot the second server (because Windows), and it was doing nothing else other than verifying the backend of the SMB-stored backup. On restart it’s now doing the same, but for the main Wasabi-stored backup job.

What is going on? Anyone? Please?

A third machine, this time a Windows 10 PC, has started doing the same. Another one where backups had been 100% successful for weeks.

FYI, currently running 2.0.4.16_canary_2019-03-28 on all systems.

More digging around and I discovered this: Backups appear to be missing some index files · Issue #3703 · duplicati/duplicati · GitHub

The cause should be fixed in 2.0.4.17_canary_2019-04-11. Backups will still need to recreate and upload any index files that got missed, but they should no longer get missed.

What at least helped in my case was stopping the backup, stopping the service, opening the database with an SQLite editor (e.g. DB Browser for SQLite) and running the SQL command ANALYZE.

Afterwards, all the SELECTs and related queries were running in a reasonable time.
This might also be a hint for the developers on how to fix this. I had the same issue when recreating a database (queries sometimes running for 3-4 days). Maybe this could generally solve the slow database access.
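
If anyone would rather not install a GUI editor, the same thing can be done with a few lines of Python and its built-in sqlite3 module. This is just a sketch: stop the Duplicati service first, make a copy of the database, and note that the path below is only an example; use the local database path shown for your own backup job.

```python
import shutil
import sqlite3

# Example path only -- use the local database path shown for your backup job.
DB_PATH = r"C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite"

# Keep a copy first; the file is an ordinary SQLite database.
shutil.copy2(DB_PATH, DB_PATH + ".bak")

with sqlite3.connect(DB_PATH) as conn:
    # ANALYZE collects statistics on tables and indexes so the SQLite query
    # planner can choose sensible indexes for the big verify/recreate queries.
    conn.execute("ANALYZE")
```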

Firstly, welcome to the forums

Secondly, thank you so much: you’ve helped me fix the two machines I’ve been waiting nearly two weeks on to complete their verifications. Both completed within a few hours after running ANALYZE on their respective databases. I could instantly tell it was different, as each was uploading new files every few seconds instead of one or two per hour.

Would be nice to see this added to Duplicati as a native action.

GitHub issue created here:

Hopefully this will help mine too!

The other bottleneck seems to be CPU usage: when “Verifying backend data …” and putting .dindex.zip.aes files onto the remote storage, it maxes out only one CPU core. I assume it would be a mammoth task to get this multithreaded?