It's a couple of years old (May 2017), but the writer was evaluating various backup solutions and noted that if the database is lost, restores with non-standard block sizes could fail.
This is EXACTLY my use case, so very concerning! My setup is basically my photo/video collection over my life, about 300 GB (lots of baby videos haha), which I back up with 800 MB blocks (just to avoid having a huge folder with lots of .aes files). The purpose is that in a hard-disk-has-died scenario, I can get my (irreplaceable!) photos back.
Wondering if anyone has encountered problems like the ones that person described, and/or has a similar setup, actually went through restores, and has thoughts about how to make this robust?
Seems like nobody who's stopped by (not everybody follows the forum) is reporting current problems. Possibly the best thing to do is to try your own restore, from either your actual backup or a similar test backup.
Testing recovery from a hard-disk-has-died scenario is a good idea. Sometimes people are surprised at how slow database recreation can run. Doing a direct restore to another computer helps ensure that works right. Database recreation actually got a speed-up recently, but it's still in canary, not in any beta release yet.
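If you'd rather script that test than click through the GUI, a rough CLI equivalent is below. This is only a sketch: the file:// URL, restore path, and passphrase are placeholders you'd replace with values from your own job (the "Export as command-line" option gives you the real URL and settings). With no local database supplied, Duplicati builds a temporary one, which is the same path the GUI's direct restore takes.

```
# Sketch of a disaster-recovery test restore (placeholders throughout).
# No --dbpath is given, so Duplicati rebuilds a temporary database,
# just like "Direct restore from backup files" in the GUI.
duplicati-cli restore "file:///mnt/backupdrive/photos" "*" \
  --restore-path=/tmp/restore-test \
  --passphrase=<your-passphrase>
```

(On Windows the executable is Duplicati.CommandLine.exe rather than duplicati-cli.)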
Choosing sizes in Duplicati is possibly worth reviewing. Your 800 MB sounds like --dblock-size, set from "Remote volume size" in the GUI and called "Upload volume size" in the manual. The actual volume sizes vary, especially on the last volume of a backup, and Duplicati should handle whatever sizes the backup job made.
--blocksize, which defaults to 100 KB, is the size that has to stay stable. Choose per the above, then leave it.
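To make the distinction between the two options concrete, here's a sketch of a backup run with both set explicitly. The URL, source folder, and passphrase are placeholders; the values just mirror the numbers in this thread.

```
# --dblock-size (remote/upload volume size) is safe to change later;
# new volumes simply come out at the new size.
# --blocksize (deduplication block size, default 100 KB) must stay the
# same for the life of the backup, so pick it once and leave it.
duplicati-cli backup "file:///mnt/backupdrive/photos" ~/Pictures \
  --dblock-size=800MB \
  --blocksize=100KB \
  --passphrase=<your-passphrase>
```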
EDIT 2: Using Advanced Options --blocksize=500KB on screen 2 of the direct restore sequence fixed it. Your --blocksize might be different, or maybe you're at the default, and a larger remote volume won't matter.
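For completeness, the CLI analog of that fix is passing the same option to a restore command. Again a sketch with placeholder URL and paths, assuming a backup that was created with --blocksize=500KB:

```
# The rebuilt (temporary) database must use the same --blocksize the
# backup was made with, hence passing it explicitly at restore time.
duplicati-cli restore "file:///mnt/backupdrive/photos" "*" \
  --blocksize=500KB \
  --restore-path=/tmp/restore-test \
  --passphrase=<your-passphrase>
```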
Looks like the best way to stay out of trouble. I ended up exporting the configuration files and putting them on my Dropbox (with their fairly strong recovery capabilities, I am not worried about losing them), and thereafter I can use Duplicati's restore-from-configuration-file function.
Tested, and I do get some warnings but nothing fatal, so I'm happy enough not to dig further into the parameters.
That being said, this approach does need a database rebuild, which seems to be an extremely time-costly step. I thought I saw this mentioned elsewhere: is there an option to back up the database as well?
I am running everything on the canary versions, and the database step is still really painfully slow. Too bad I am not a real programmer (and I have a day job); I wouldn't mind supporting some dev effort to build in database backup as an option.
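In the meantime I'm just copying the database files out by hand. A sketch, assuming a typical Linux install where the job databases are the random-named .sqlite files under ~/.config/Duplicati (on Windows they live under %LOCALAPPDATA%\Duplicati); paths are assumptions, and the copy should happen while no backup is running:

```
# Crude "database backup": copy Duplicati's local databases to Dropbox.
# Paths assume a typical Linux install; adjust for your machine.
mkdir -p ~/Dropbox/duplicati-db-backup
cp ~/.config/Duplicati/*.sqlite ~/Dropbox/duplicati-db-backup/
```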