Verifying backend data forever

#16

@ts678 I just did a search in the file listing for that file and it was not found.

What should I try next? Just be aware I'm very new to my Unraid server and Duplicati.

#17

Personally, I'd try again for a log view. I'm not sure why your last one had so little; maybe it was a timing issue. Please open a second browser tab at MENU --> About --> Show log --> Live --> Information, then start the backup from the first tab. If you'd rather not watch the browser, you can instead use the --log-file and --log-level options. In my case the log file seemed to ignore Information and went for the even-more-verbose Profiling level, which is actually better for chasing an issue, but you'll probably want to turn it off (or at least trim the file occasionally) when you're not troubleshooting.
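If you go the file route, those two options can be set on the job under Options --> Advanced, or passed to a command-line run. A rough sketch (the launcher name and paths are placeholders; on Windows the binary is Duplicati.CommandLine.exe, and adjust everything for your install):

```shell
# Placeholder target/source/paths; adjust for your install.
# Profiling is very verbose, so point the log at a disk with room to spare.
duplicati-cli backup <target-url> <source-folder> \
  --log-file=/path/to/duplicati.log \
  --log-level=Information
```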

#18

@ts678 Nothing seemed to be making any difference so I told it to delete and recreate the database. This seems to have made a difference and the current progress says:

Current action: Backup_ProcessingFiles

I will update here if it completes successfully.

#19

OK, the result of the backup is below. Does it look OK to you?

Dec 2, 2018 11:12 AM: Result
DeletedFiles: 0
DeletedFolders: 0
ModifiedFiles: 0
ExaminedFiles: 162240
OpenedFiles: 162240
AddedFiles: 162240
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 1156760384827
SizeOfExaminedFiles: 1156760384827
SizeOfOpenedFiles: 1156760384827
NotProcessedFiles: 0
AddedFolders: 839
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults: null
DeleteResults:
DeletedSets:
Dryrun: False
MainOperation: Delete
CompactResults: null
ParsedResult: Success
Version: 2.0.4.5 (2.0.4.5_beta_2018-11-28)
EndTime: 12/2/2018 11:12:57 AM (1543749177)
BeginTime: 12/2/2018 11:11:42 AM (1543749102)
Duration: 00:01:15.4128610
Messages: [
2018-12-01 14:42:29 +00 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
2018-12-01 14:47:26 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bf03f0e5b0e8f43359dbae85a5afe82fb.dblock.zip (2.00 GB),
2018-12-01 14:47:37 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bf03f0e5b0e8f43359dbae85a5afe82fb.dblock.zip (2.00 GB),
2018-12-01 14:47:38 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i4673d366abea4676851f160da9859cbe.dindex.zip (1.40 MB),
2018-12-01 14:47:39 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i4673d366abea4676851f160da9859cbe.dindex.zip (1.40 MB),

]
Warnings:
Errors:
BackendStatistics:
RemoteCalls: 1047
BytesUploaded: 1122542034171
BytesDownloaded: 0
FilesUploaded: 1047
FilesDownloaded: 0
FilesDeleted: 0
FoldersCreated: 0
RetryAttempts: 0
UnknownFileSize: 0
UnknownFileCount: 0
KnownFileCount: 1047
KnownFileSize: 1122542034171
LastBackupDate: 12/1/2018 2:42:29 PM (1543675349)
BackupListCount: 1
TotalQuotaSpace: 2999127797760
FreeQuotaSpace: 759400058880
AssignedQuotaSpace: -1
ReportedQuotaError: False
ReportedQuotaWarning: False
ParsedResult: Success
Version: 2.0.4.5 (2.0.4.5_beta_2018-11-28)
Messages: [
2018-12-01 14:42:29 +00 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
2018-12-01 14:47:26 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bf03f0e5b0e8f43359dbae85a5afe82fb.dblock.zip (2.00 GB),
2018-12-01 14:47:37 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bf03f0e5b0e8f43359dbae85a5afe82fb.dblock.zip (2.00 GB),
2018-12-01 14:47:38 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i4673d366abea4676851f160da9859cbe.dindex.zip (1.40 MB),
2018-12-01 14:47:39 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i4673d366abea4676851f160da9859cbe.dindex.zip (1.40 MB),

]
Warnings:
Errors:
RepairResults: null
TestResults: null
ParsedResult: Success
Version: 2.0.4.5 (2.0.4.5_beta_2018-11-28)
EndTime: 12/2/2018 11:12:57 AM (1543749177)
BeginTime: 12/1/2018 2:42:29 PM (1543675349)
Duration: 20:30:28.3308700
Messages: [
2018-12-01 14:42:29 +00 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
2018-12-01 14:47:26 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bf03f0e5b0e8f43359dbae85a5afe82fb.dblock.zip (2.00 GB),
2018-12-01 14:47:37 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bf03f0e5b0e8f43359dbae85a5afe82fb.dblock.zip (2.00 GB),
2018-12-01 14:47:38 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i4673d366abea4676851f160da9859cbe.dindex.zip (1.40 MB),
2018-12-01 14:47:39 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i4673d366abea4676851f160da9859cbe.dindex.zip (1.40 MB),

]
Warnings:
Errors:

#20

Looks mostly reasonable, assuming the plan progressed from “delete and recreate the database” to a new backup. You can get a file count and size from your OS tools, but from these numbers the average source file size is about 7 MB (plausible for photos from some cameras), while the average destination dblock file size is a surprising 2 GB. That was presumably a manual setting, but it seems high: losing a single backup file would mean losing many photos.

ExaminedFiles: 162,240
OpenedFiles: 162,240
AddedFiles: 162,240
SizeOfAddedFiles: 1,156,760,384,827
SizeOfExaminedFiles: 1,156,760,384,827
SizeOfOpenedFiles: 1,156,760,384,827

BytesUploaded: 1,122,542,034,171
KnownFileCount: 1047
KnownFileSize: 1,122,542,034,171
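As a quick sanity check on those numbers (treating “MB” as 10^6 bytes, the 2GB volume setting as 2 GiB, and with the dblock/dindex/dlist split being my assumption about what makes up the 1047 destination files):

```python
size_of_source = 1_156_760_384_827  # SizeOfExaminedFiles, bytes
source_files = 162_240              # ExaminedFiles

known_size = 1_122_542_034_171      # KnownFileSize at destination, bytes
dblock_size = 2 * 1024**3           # the 2 GB remote volume size

avg_source_mb = size_of_source / source_files / 1_000_000
dblock_count = round(known_size / dblock_size)

print(round(avg_source_mb, 1))  # 7.1 MB per source file
print(dblock_count)             # 523 dblock volumes
# 523 dblocks + 523 dindexes + 1 dlist = 1047 = KnownFileCount
```

That the volume count lines up exactly with KnownFileCount (523 + 523 + 1 = 1047) is reassuring.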

Choosing sizes in Duplicati
Remote Volume Size

Backup by default does some sampled verification of the destination files after the backup runs, configurable via --backup-test-samples, with the meaning of a sample explained in The TEST command. Using a 2 GB --dblock-size will probably mean a bit of a slowdown there, even if the number of new photos uploaded is small.

#21

I forgot to save the database when reinstalling the OS. Related to this, I also had issues with getting stuck on “Verifying backend data”. After deleting and recreating the database as described above, the backup started working again.

These are the steps I took and the issues I had:

  1. Added a backup manually and pointed it at the same source and backup location (SFTP).

  2. Ran the backup. Got a warning and a proposal to repair the database, which I did.

  3. After repairing the database I ran the backup again, and it completed after going through all 180 GB of files. The log stated that the backup completed successfully.

  4. Reran the backup to check that everything worked and that it would complete quickly. But it got stuck on “Verifying Backend”. I waited for more than 24 hours, stopped, restarted, waited another 24+ hours, stopped, restarted. The live log message it got stuck on was:

Jan 26, 2019 12:35 PM: Starting - ExecuteReader: SELECT "A"."Hash", "C"."Hash" FROM (SELECT "BlocklistHash"."BlocksetID", "Block"."Hash", * FROM "BlocklistHash","Block" WHERE "BlocklistHash"."Hash" = "Block"."Hash" AND "Block"."VolumeID" = 5525) A, "BlocksetEntry" B, "Block" C WHERE "B"."BlocksetID" = "A"."BlocksetID" AND "B"."Index" >= ("A"."Index" * 3200) AND "B"."Index" < (("A"."Index" + 1) * 3200) AND "C"."ID" = "B"."BlockID" ORDER BY "A"."BlocksetID", "B"."Index"

  5. Tried a number of things (I don't remember which).

  6. Deleted and recreated the database (after finding this post).

  7. Ran the backup; it completed in about 15 hours. It did not get stuck on “Verifying Backend”; the time was spent copying or checking files.

  8. Reran the backup 3 days later. It completed in less than 1 hour.
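For what it's worth, the 3200 in that stuck query is not arbitrary. Assuming the default 100 KiB --blocksize and 32-byte SHA-256 block hashes (my reading of Duplicati's defaults), it is the number of hashes that fit in one blocklist block:

```python
blocksize = 100 * 1024   # default --blocksize: 100 KiB
hash_size = 32           # SHA-256 digest length in bytes

hashes_per_blocklist = blocksize // hash_size
print(hashes_per_blocklist)  # 3200
```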

#22

Thanks for sharing your experience. We know the database work can get quite slow, and while we're working to improve it, progress isn't happening very quickly.

We try to warn people it can take a long time, but “long time” means different things to different people. :slight_smile:

I tested a forced recreate on a 5+ year old laptop and aborted it after 3 weeks (I had to reboot for a software install). Since it was a test, I simply restored the backup copy I had made and everything was fine.

I documented a smaller test of recreating on differently powered machines from the same destination here, if you're curious:

Note that while I did not test it, I suspect a recreate can be done anywhere and the resulting database copied to the ultimate Duplicati install machine.

This means that, in theory, one could make a standalone database-recreate script to run at the destination (assuming your destination can run things), then copy the recreated database to where it's needed.

#23

Hello,

I have a question about the “Verifying backend data” thing:

I am using Duplicati on my local Win10 PC to make regular backups to an external USB drive, which currently holds about 1.15 TB, and it has worked very well so far. Unfortunately the PC froze yesterday while it was doing the backup, so I had to shut down the computer.

After that I restarted the backup and noticed that it took a long time verifying backend data, so I interrupted it by killing the Duplicati.GUI.TrayIcon.exe process in Task Manager, because it didn't stop when I tried to interrupt it in the browser. Next I tried “Verify files” under “Advanced” and also “Repair” under “Database”. When I started the backup again it went into the “Verifying backend data” mode after a minute again. I left it overnight, and after several hours the status was the same.

I have read in the forum that some users had to wait a long time for this process to finish, but what makes me doubtful is that when I look at Task Manager while Duplicati is in the “Verifying backend data” mode, both the destination and source hard disks and also the OS hard disk are continuously idle. Only the CPU is constantly working, at 30%.

So is it worth letting Duplicati work for a day or longer while only the CPU is busy? Or is it possible that the verifying process is stuck?

The version before was 2.0.2.1; I have now tried updating to 2.0.4.5, with no change. I also tried the --no-backend-verification=true option.

Thank you in advance for your help.

Best regards,

Christian

#24

I’d recommend checking the About -> System info page and scrolling down to the bottom section where you should see a “lastPgEvent” block. Watch that for changes to see if Duplicati is still working on things.

#25

Did you find a solution to your problem? I’m experiencing your exact same issue, and I’m out of options on what to do next.

Edit: My job did not actually finish in 4 hours; it is still not done after 6 days… So be patient or start over.

#26

Me too. I've got one, albeit very slow, server that's been at this for over a week and shows no sign of ever ending. The only good thing is that it's still going and not simply hung.

#27

So for some reason my backup job suddenly takes 3.5 hours to complete instead of the regular 30 minutes it used to take. It's good that it does complete, but for my use, 3.5 hours is impractically long.

#28

Yes, after the crash I ran the backup and it took several hours. I also checked the “lastPgEvent” block under About -> System info, which was always updating. After that the backup worked as usual.

So if you have a crash during a backup, you must be patient for the next try.

#29

Several hours is fine, but over a week isn't. I thought it was related to the backup being on Wasabi storage, but another server has decided to do the same, and that one is a local SMB storage backup. It also hasn't been failing, so why it's decided to do this I have no clue.

lastPgEvent : {"BackupID":"3","TaskID":5,"BackendAction":"Put","BackendPath":"ParallelUpload","BackendFileSize":943945,"BackendFileProgress":943945,"BackendSpeed":9,"BackendIsBlocking":false,"CurrentFilename":null,"CurrentFilesize":0,"CurrentFileoffset":0,"Phase":"Backup_PreBackupVerify","OverallProgress":0,"ProcessedFileCount":0,"ProcessedFileSize":0,"TotalFileCount":118729,"TotalFileSize":825545822534,"StillCounting":false}
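That blob is plain JSON, so if you want to watch it from a script rather than the System info page, the Phase and ProcessedFileCount fields are the ones to poll. A minimal sketch parsing the pasted value:

```python
import json

# The lastPgEvent value pasted above, verbatim
last_pg_event = ('{"BackupID":"3","TaskID":5,"BackendAction":"Put",'
                 '"BackendPath":"ParallelUpload","BackendFileSize":943945,'
                 '"BackendFileProgress":943945,"BackendSpeed":9,'
                 '"BackendIsBlocking":false,"CurrentFilename":null,'
                 '"CurrentFilesize":0,"CurrentFileoffset":0,'
                 '"Phase":"Backup_PreBackupVerify","OverallProgress":0,'
                 '"ProcessedFileCount":0,"ProcessedFileSize":0,'
                 '"TotalFileCount":118729,"TotalFileSize":825545822534,'
                 '"StillCounting":false}')

state = json.loads(last_pg_event)
print(state["Phase"])               # Backup_PreBackupVerify
print(state["ProcessedFileCount"])  # 0, i.e. still stuck before the backup proper
```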

#30

I was forced to reboot the second server (because Windows), and it was doing nothing other than verifying the backend of the SMB-stored backup. On restart it's now doing the same, but for the main Wasabi-stored backup job.

What is going on? Anyone? Please?

**The cause should be fixed in 2.0.4.17_canary_2019-04-11. Backups will still need to recreate and upload any files that got missed, but they should no longer get missed.**

#31

A third machine, this time a Windows 10 PC, has started doing the same. Another where backups have been 100% for weeks.

FYI, currently running 2.0.4.16_canary_2019-03-28 on all systems.

More digging around and I discovered this: Backups appear to be missing some index files · Issue #3703 · duplicati/duplicati · GitHub

**The cause should be fixed in 2.0.4.17_canary_2019-04-11. Backups will still need to recreate and upload any files that got missed, but they should no longer get missed.**

#32

What at least helped in my case was stopping the backup, stopping the service, opening the database with an SQLite editor (e.g. DB Browser for SQLite), and running the SQL command ANALYZE.

Afterwards, all the SELECTs and other queries ran in a reasonable time.
This might also be a hint for the developers on how to fix this. I had the same issue with recreating a database (queries sometimes running for 3-4 days). Maybe running ANALYZE could generally solve the slow database access.
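If you'd rather not install a GUI editor, the same ANALYZE can be run with a few lines of Python against the job database while Duplicati is stopped (the path below is a placeholder; find the real .sqlite file under Duplicati's data folder first):

```python
import sqlite3

db_path = "backup-database.sqlite"  # placeholder; use your job's database file

con = sqlite3.connect(db_path)
con.execute("ANALYZE")  # refresh SQLite's query-planner statistics
con.commit()
con.close()
print("ANALYZE finished")
```

Take a copy of the database first, just in case, and don't run this while a backup is in progress.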

#33

Firstly, welcome to the forums

Secondly, thank you so much: you've helped me fix the two machines I've been waiting on for nearly two weeks to complete their verifications. Both completed within a few hours of running ANALYZE on their respective databases. I could instantly tell it was different, as each was uploading new files every few seconds instead of one or two per hour.

Would be nice to see this added to Duplicati as a native action.

#34

GitHub issue created here:

#35

Hopefully this will help mine too!

The other bottleneck seems to be CPU usage: when “Verifying backend data …” and putting .dindex.zip.aes files onto the remote storage, it maxes out one CPU core only. I assume it would be a mammoth task to make this multithreaded?