I noticed about a week ago that my backup had been failing for the past month or so due to a file size mismatch on a single file. I followed the recommendation to restore the database, but the restore failed (I can't remember the exact error message). I then tried to recreate the database, but the process has been taking an unusually long time (it seemed to be stuck at 70% for about a week, and then Windows Update decided it was time to reboot).
I have a version of the database from the last successful backup. Would I be able to use that in some way to ease the repair/recreate process?
Alternatively, could I drop that old database in and rename the offending file to something else so Duplicati stops hiccuping on it?
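To be concrete about what I had in mind, something roughly like this (a hypothetical sketch only; the paths, filenames, and destination type are placeholders I made up, and I haven't verified that `repair` with `--dbpath` behaves this way against an older database):

```shell
:: Stop the Duplicati service/tray first, then swap in the database
:: from the last successful backup. Paths are examples; yours will differ.
copy "D:\old-backup\Duplicati-old.sqlite" "%LOCALAPPDATA%\Duplicati\MYBACKUP.sqlite"

:: Rename the file with the size mismatch at the destination so Duplicati
:: no longer sees it (assuming a local file destination here).
ren "E:\duplicati-dest\<offending-file>.dblock.zip" "<offending-file>.dblock.zip.bad"

:: Then let Duplicati reconcile the old database against the destination.
Duplicati.CommandLine.exe repair "file://E:\duplicati-dest" --dbpath="%LOCALAPPDATA%\Duplicati\MYBACKUP.sqlite"
```

Would something along those lines be safe, or would the mismatch between the old database and the current destination contents cause its own problems?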
edit: I believe the old database version is from Duplicati 2.1.0.5, which was also where the last successful backup was from.
My main concern is the speed of the database recreate in its current state. It will process a 50.11MiB blocklist file (the last message I got was "Pass 1 of 3, processing blocklist volume 12 of 1923…"), do about 300GB of writes to my temp directory (my pagefile is on the same drive, so some of those writes may have been that), and then a couple of hours later start the next volume. At this rate the recreate should finish in about half a year, which feels very slow, and I don't want to keep my computer running constantly for that long. To say nothing of it apparently needing multiple bytes of temp writes per bit of the blocklist file in order to process it.
Duplicati Version: 2.2.0.3_stable_2026-01-26
Source: 6.69TiB
Destination: 8.8TiB