Stuck on Verifying Backend Data [2.0.3.6]

I have the same problem with a backup to Google Drive. The blocksize is 50MB and I have a large database of around 8GB. Duplicati gets stuck for days at the same SQL query (using one CPU core), and I've let it run several times for up to a week. A database repair either crashes or doesn't finish within a week. In some runs I also saw it upload a block once in a while, but very rarely; it mostly just keeps running that SQL query.
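
For context, the job is roughly equivalent to this CLI invocation (the backend URL, authid, source path, and database path below are placeholders, not my real settings):

```sh
# Approximate CLI equivalent of the job described above; the backend URL,
# authid, source path, and database path are all placeholders.
duplicati-cli backup "googledrive://backup-folder?authid=XXXX" /data/to/backup \
  --blocksize=50MB \
  --dbpath="$HOME/.config/Duplicati/XXXXXXXXXX.sqlite"
```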

So for me, retrying didn't help. I even tried to use mono to optimize the Duplicati binaries (mono --aot=full -O=all), but it could not compile some methods. Any other ideas? (I'm using Linux and Duplicati 2.0.3.6.)
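
For reference, the AOT attempt looked roughly like this; the install path under /usr/lib/duplicati is how my package laid things out and may differ on other systems:

```sh
# Ahead-of-time compile each Duplicati assembly with full optimizations.
# The install directory is an example; adjust it for your system.
cd /usr/lib/duplicati
for asm in *.exe *.dll; do
  mono --aot=full -O=all "$asm"
done
```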

If you downgrade to 2.0.3.5, does the problem go away? A few issues have been found in 2.0.3.6 (including performance-related ones), so this might be related.

I downgraded to 2.0.3.5. With the existing database it would still get stuck at the same query. Then I deleted and recreated the database, and the backup finally finished uploading. It also successfully verified the remote state. No errors, everything seems fine. I even restored an older file to make sure the backup is actually usable.
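
For anyone hitting the same thing, the delete-and-recreate step is roughly equivalent to the following on the command line (the database path and backend URL are placeholders for your own job settings):

```sh
# Move the old job database aside, then let "repair" rebuild it from the
# remote data. Paths and the backend URL are placeholders.
mv "$HOME/.config/Duplicati/XXXXXXXXXX.sqlite" "$HOME/.config/Duplicati/XXXXXXXXXX.sqlite.bak"
duplicati-cli repair "googledrive://backup-folder?authid=XXXX" \
  --dbpath="$HOME/.config/Duplicati/XXXXXXXXXX.sqlite"
```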

Great!

While I don't recall specific 2.0.3.6 database errors (though @kenkendk might know of some), it sure looks like there might be a database corruption issue with 2.0.3.6 (whether it comes from normal backups or from database maintenance is unclear).

I've gone ahead and flagged your comment as the solution and updated the title to better reflect that this seems to have been a 2.0.3.6-related issue. Please let me know if you disagree.