Checking errors, related to #1400

With the 2.0.4.5 beta (2.0.4.5_beta_2018-11-28) I get the #1400 error from time to time, and the problem always looks to be the SQLite file. What is the fix?

2019-01-04 09:07:33 +01 - [Error-Duplicati.Library.Main.Database.LocalBackupDatabase-CheckingErrorsForIssue1400]: Checking errors, related to #1400. Unexpected result count: 0, expected 1, hash: nubMcrVRlJ3NhzUfx5eEsZlu9yxD8P9Fzm7K0fjkW4I=, size: 22704128, blocksetid: 251932, ix: 2, fullhash: XU2xahY49ZFnMsParEAl0zlDqaefyEJ3wTPOzGtVOPc=, fullsize: 232419328
2019-01-04 09:07:33 +01 - [Error-Duplicati.Library.Main.Database.LocalBackupDatabase-FoundIssue1400Error]: Found block with ID 256538 and hash nubMcrVRlJ3NhzUfx5eEsZlu9yxD8P9Fzm7K0fjkW4I= and size 22701056
2019-01-04 09:07:33 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: /root/.config/Duplicati/82737979726665657565.sqlite

Are you using snapshots?

I can’t tell if that SQLite file is the one for the current backup, but if so, I think even snapshots may not help there, and you might just have to exclude it from its own backup.

I wish there was an easy way to do that.

There’s a chance that snapshots will avoid the #1400 error, but a backup of an actively-changing database is a poor candidate for a restore, because even if it isn’t seen as corrupted on the next run, it will be rather obsolete. People who really intend to back up the database for their main job (e.g. to avoid the hassle of DB recreation if their drive breaks) sometimes back up that job’s database (find the path in the Database tab) in a second job. I’m not sure if @rocco is trying to do that, or just backing up a large set of files, inviting Duplicati to trip here…

Well, I back up everything, and nothing in the defaults excludes the SQLite DB from the backup, so…

Obviously Duplicati should not trip this way, though there are other failure modes (such as locked files) that can occur due to OS limitations. For example, locked files might require snapshots in order to be read by some other program.

If you don’t have a specific desire to back up the Duplicati databases, try something like --snapshot-policy=on to work around Duplicati’s issue. If you do have a specific intent to be able to restore Duplicati databases, make another job so that you can back up while neither the database nor the destination are in the middle of updates.
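Concretely, that could look something like the sketch below. This is only an illustration: the destination URL and source path are placeholders (not from this thread), and the exact option spellings should be checked against your version’s `duplicati-cli help` output. The excluded folder matches the one that failed in the log above.

```shell
# Sketch: back up /home while excluding Duplicati's own database folder,
# and enable snapshots so locked/changing files can still be read.
# file:///mnt/backup and /home are placeholder paths for illustration.
duplicati-cli backup file:///mnt/backup /home \
  --snapshot-policy=on \
  --exclude="/root/.config/Duplicati/"
```

If you instead want the databases restorable, put that excluded folder into its own second backup job that runs while the first job is idle.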

Does it make sense to backup Duplicati config and db files? raises that question, but the answer is your choice.