Recently I’ve been getting errors with the database. So I closed Duplicati, copied a version of the DB, then started running a database recovery. After a week or so it was only partially done, so it seems it would take months to complete, which doesn’t work for me.
I do have an older database backup from May 2023. Is there any chance of using this as a base and then perhaps upgrading from there? Or should I just cut my losses and start a brand-new backup?
You can try to grab the test build for the next Canary from here.
It has a fix for this problem. Note that for a large database with serious damage this can’t be fast, but hopefully it will take less than 36 hours (that was the time needed for the last two people who had this problem here). After the rebuild, some cleaning of invalid data may also be necessary (with purge-broken-files, for example).
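For reference, that cleanup is roughly the following from the command line (a sketch only: the storage URL and database path are placeholders you’d replace with your own, and list-broken-files lets you preview what would be removed before purging):

```
Duplicati.CommandLine.exe list-broken-files <storage-URL> --dbpath=<path-to-local-db>
Duplicati.CommandLine.exe purge-broken-files <storage-URL> --dbpath=<path-to-local-db>
```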
Note that this build was done 2 days ago and these builds expire in 3 days, so don’t delay downloading it.
I wouldn’t recommend trying that; it could lead to data loss on the backend and more time wasted.
Yeah, there didn’t seem to be any speed increase in the newer version. I started it just before Christmas and it had just about finished a few days ago, then we had a power failure and it failed.
Ultimately it’s my own fault; I felt invincible and so had 3 years’ worth of daily backups (which I realize now is a little overkill). So it’s quicker to start the backup from scratch, but this time I have a more thought-out retention policy in place, which hopefully will help.
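In case it helps anyone else, a thinned-out schedule like that can be expressed with Duplicati’s `--retention-policy` option. The exact timeframes below are just an example (keep dailies for a week, weeklies for a month, monthlies for a year), not a recommendation:

```
--retention-policy="1W:1D,4W:1W,12M:1M"
```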
It’s unfortunate that you did not post about it while it was happening, since now that it’s done there is no way to investigate why this didn’t fix it for you. You didn’t produce a trace file by any chance?
Yeah, sorry, the backup failed due to a power failure, and rather than try it again I decided to start new. I never thought to keep a log as Duplicati itself seemed to run fine (just long).
That might be blending two things. The blocksize default is 100 KB. There’s a request to make it 1 MB.
The dblock-size is the size of a file of blocks, and is usually set via “Remote volume size”; its default is 50 MB. Choosing sizes in Duplicati covers this, but the Options GUI only highlights volume size, not blocksize.
If you’re starting a new backup, it would be a good chance to scale blocksize up by, say, a factor of 20.