I have a daily backup (the source file is an MSSQL .bak) monitored through duplicati-monitoring.com. The reports show the size increasing every day, but when I restore versions newer than 26/07, the restored file is the one from the 26/07 version, not from the selected version.
If I restore the version from 24/07 the size is 7.253.323 KB
If I restore the version from 26/07 the size is 7.262.323 KB
If I restore the version from yesterday night (04/08) the size is the same as the 26/07 version, 7.262.323 KB. The strange thing is that if I go to the server and check the size of the original .bak from yesterday, it is 7.395.443 KB…
What is happening? Is it not backing up as expected, or not restoring as it should? When the partial DB recreation finishes and the download process starts, I get 5 errors like this one:
2019-08-05 12:21:53 +02 - [Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-b7355a2c15a254ab28e0d8f3d2777a621.dblock.zip.aes by duplicati-i19332a5fd1084a0ea682a198ffce0c85.dindex.zip.aes, but not found in list, registering a missing remote file
And when the restore finishes I get two more warnings:
2019-08-05 12:23:00 +02 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 5 missing volumes; attempting to replace blocks from existing volumes
Note that the restores are made from another Duplicati instance in another location; I'm testing whether the backups are usable.
Any suggestions? I don't know how to proceed.
I suspect the project currently has a shortage of people with deep expertise and volunteer time, so we'll see whether anybody else gets onto this. On your end, are you testing, or calling for help in a disaster situation?
My initial reaction on first seeing this was that missing dblocks are a bad situation, because they aren't replicated anywhere and they represent actual file data. You can look at these for some more tests:
The AFFECTED command
The LIST-BROKEN-FILES command
To run the commands that don’t have a special GUI, you can use these:
Using the Command line tools from within the Graphical User Interface
Exporting a backup job configuration
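For reference, if you run these from a shell instead of the GUI Commandline, the two checks look roughly like this (sketch only — the storage URL and passphrase are placeholders for your own values, and the dblock name is the one from your error message):

```
Duplicati.CommandLine.exe affected <storage-url> duplicati-b7355a2c15a254ab28e0d8f3d2777a621.dblock.zip.aes --passphrase=<passphrase>
Duplicati.CommandLine.exe list-broken-files <storage-url> --passphrase=<passphrase>
```

The first shows which source files depend on a given remote volume; the second lists everything that can no longer be restored intact.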
If you're in a hurry to resume backups, you can export the job and import it as a new one. Be sure to change the destination.
Regardless, try to preserve all logs from the current problem, and look for any oddities in prior backup runs. Logs for the backup are stored in the backup's sqlite DB, so don't Recreate the DB without saving a copy elsewhere first.
Maybe it will be helpful to turn the DB into a DB bug report, but the privacy protection there gets in the way, while also being slightly imperfect. You don't happen to have set up --log-file logging in advance, do you? That would be something to consider for future tests, to figure out what happened; the default logs help less.
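If you do set up logging for a future run, an advanced-options fragment along these lines (the path is a placeholder for wherever you want the log) captures far more detail than the default logs:

```
--log-file=C:\ProgramData\Duplicati\backup.log
--log-file-log-level=Retry
```

The Retry level records each remote put/get/list attempt, which is exactly the trail you'd want for a missing-dblock investigation.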
If you have no detailed logs set up, the default logs show nothing odd, and you don't want to publish a DB bug report, you can use DB Browser for SQLite locally. Look in the RemoteVolume table for remote files such as duplicati-b7355a2c15a254ab28e0d8f3d2777a621.dblock.zip.aes that are in a questionable "State". The RemoteOperation table, filtered by their Path, will show when the "put" upload operation was attempted, and each "list" operation recorded a view of what files were there at that time. I'm not sure whether yours never made it up, or got uploaded and later deleted, but (as the restore attempt found) there are files not there at the moment that ought to be there. That needs tracing.
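If you'd rather script it than click through DB Browser, a small Python sketch like this can pull the same information. The table and column names (RemoteVolume with Name/State, RemoteOperation with Timestamp/Operation/Path) and the "healthy" state values are my assumptions from the description above — verify them against your actual DB, and always work on a copy. Here the queries run against a tiny mock database just to show the shape:

```python
import sqlite3

def questionable_volumes(conn):
    """Remote volumes whose State is not a healthy one.

    Assumed schema: RemoteVolume(Name, State); assumed healthy
    states 'Uploaded'/'Verified' -- check against your real DB.
    """
    cur = conn.execute(
        "SELECT Name, State FROM RemoteVolume "
        "WHERE State NOT IN ('Uploaded', 'Verified')"
    )
    return cur.fetchall()

def operations_for(conn, path):
    """Recorded operations (e.g. 'put', 'list') for one remote file.

    Assumed schema: RemoteOperation(Timestamp, Operation, Path).
    """
    cur = conn.execute(
        "SELECT Timestamp, Operation FROM RemoteOperation "
        "WHERE Path = ? ORDER BY Timestamp",
        (path,),
    )
    return cur.fetchall()

# Mock DB standing in for the backup's sqlite file -- on a real system
# you would sqlite3.connect() a *copy* of the backup's database instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE RemoteVolume (Name TEXT, State TEXT)")
conn.execute(
    "CREATE TABLE RemoteOperation (Timestamp INTEGER, Operation TEXT, Path TEXT)"
)
name = "duplicati-b7355a2c15a254ab28e0d8f3d2777a621.dblock.zip.aes"
conn.execute("INSERT INTO RemoteVolume VALUES (?, 'Deleting')", (name,))
conn.execute("INSERT INTO RemoteOperation VALUES (1564912800, 'put', ?)", (name,))

print(questionable_volumes(conn))
print(operations_for(conn, name))
```

Anything that comes back from the first query with an odd state, cross-checked against its put/list history from the second, should narrow down whether the file was never uploaded or deleted afterwards.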