Whenever I try to run the backup now, I get the error:
Failed: Found 1 files that are missing from the remote storage, please run repair
Details: Duplicati.Library.Interface.UserInformationException: Found 1 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String sources, IFilter filter)
at Duplicati.Library.Main.Controller.<>c__DisplayClass17_0.<Backup>b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String& paths, IFilter& filter, Action`1 method)
Is there a way to solve the Problem?
I am running on - 126.96.36.199_beta_2018-04-02
I couldn’t find my original post so I tested the database Repair process again (using 188.8.131.52 canary) on a very small (200M source) backup with the following results:
Deleting all 17 .dindex files and running REPAIR resulted in 17 fresh (dated today) .dindex files being created in just over 1 minute
Deleting all 18 .dlist files and running REPAIR resulted in 18 fresh (dated today) .dlist files being created in just over 1 minute (though ONE file was flagged with a warning of being uploaded with size X but should be size Y)
Deleting all 17 .dindex AND all 18 .dlist files and running REPAIR resulted in 35 fresh (dated today) files being created in 1.5 minutes
So I’d say yes - at least with 184.108.40.206 it is safe to delete all .dindex and/or .dlist files as they can be recreated from the local database ASSUMING the local database is in good condition.
My GUESS is that even if a user has no local database it would still be safe to do the tests I just did; they would simply take longer, as all the .dblock files would have to be downloaded and parsed to recreate the missing .dlist and/or .dindex files.
Let me know if you’d like me to test that as well.
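For reference, the delete-and-repair test above can be sketched against a throwaway folder. The file names below are invented but follow Duplicati's duplicati-*.dblock/.dindex/.dlist naming pattern (real backups may also carry an .aes suffix when encryption is on), and the actual REPAIR step still has to be run from Duplicati itself:

```shell
# Create a dummy destination folder that mimics a Duplicati backend
demo=$(mktemp -d)
cd "$demo"
touch duplicati-b01.dblock.zip duplicati-b02.dblock.zip \
      duplicati-i01.dindex.zip duplicati-i02.dindex.zip \
      duplicati-20180402T000000Z.dlist.zip

# Delete only the .dindex files; REPAIR can recreate these from a healthy
# local database. Never delete .dblock files - they hold the actual data.
rm -f -- *.dindex.zip

ls    # the .dblock and .dlist files remain
```

After this you would run the backup job's Repair from the Duplicati UI (or CLI) and watch the fresh .dindex files appear at the destination.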
I deleted all dlist files and tried to run the repair again. For more than 24 hours there has been no change - the progress bar sits at 10% and says Starting…
I don't know how to speed up the process - maybe something went wrong and I should cancel and retry :-/
Hi again, I canceled the repair (no progress after hours…). This morning I deleted all .dlist and .dindex files again and restarted the process. The progress bar hasn't updated yet, but I can see that it is still creating .dindex files at the destination.
I think the problem may have several causes:
my backup is about 250GB
for the .dblock files I defined a size of only 5MB (bigger sizes didn't work with the poor internet connection I have)
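Rough arithmetic (a sketch that ignores compression and deduplication) shows how many remote volumes those two settings imply, which goes a long way toward explaining why the repair is so slow:

```shell
# 250 GB of source data split into 5 MB dblock volumes:
echo "$((250 * 1024 / 5)) dblock files"   # prints "51200 dblock files"
```

Tens of thousands of tiny remote files means tens of thousands of individual transfers and database entries for every verify or repair pass.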
Just to give you an update… The repair process created .dindex files again, and this seems to have finished since yesterday morning (no new .dindex files are arriving at the destination). The progress bar is still at 0% and I don't know whether the job is doing something in the background to create the .dlist files or whether it has crashed. For the moment I keep waiting…
No progress since my last posting (no new files at the destination and no updates to the progress bar or any new log entries) - that's why I have now stopped it. It seems the existing backup is lost. :-/ I am now creating new backup jobs and splitting the backup into several parts - that way I will not lose the entire backup if problems occur in the future. I am not happy with the situation; it seems the current version isn't reliable in my case.
Sorry to hear about all the troubles (and taking so long to get back to you).
You’re doing everything that normally works, so my guess is your internet connection is triggering a (usually very transient) bug in Duplicati where an interrupted transfer causes Duplicati to wait forever for it to finish.
I’m not sure if @kenkendk has looked into this yet or not, but my guess is even if he has it’s a tough issue to track down.
Normally I’d suggest running your backup to a local drive for a while to confirm it’s stable with the Internet out of the picture, but it sounds like your connection is likely to improve, so other than waiting who-knows-how-long for a fix there may not be a solution for this issue.
If you decide to move on from Duplicati you might have better luck with Duplicacy.
The error indicates that you are missing a dblock file, so the backup no longer contains the hashed data blocks that were stored in it. A dblock file cannot be recreated; you have to remove its references from the database / dindex files, which will also make any source files that relied on that dblock unrecoverable. Run the list-broken-files command to see which files used the missing dblock and are no longer recoverable, then run purge-broken-files to remove the dblock references completely. Here is a post with some information about list-broken-files and purge-broken-files:
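A sketch of how those two commands would look from the command line. The storage URL below is a placeholder, duplicati-cli is the name the Linux packages give Duplicati.CommandLine.exe, and the script only invokes the tool if it is actually installed:

```shell
# Placeholder backend URL - substitute your actual destination
URL="ftp://backup.example.com/duplicati"

if command -v duplicati-cli >/dev/null 2>&1; then
    # Show which source files depended on the missing dblock
    duplicati-cli list-broken-files "$URL"
    # Remove the broken references (and the now-unrecoverable file entries)
    duplicati-cli purge-broken-files "$URL"
else
    echo "duplicati-cli not installed; commands shown for reference only"
fi
```

Running list-broken-files first is important: it is read-only, whereas purge-broken-files permanently rewrites the remote dlist files.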