Remote files missing... How to fix this error?

I’m a new Duplicati user and my initial backup just finished uploading to BackBlaze B2 (~650GB). After the initial upload was complete, I made Duplicati run as a Windows Service and enabled the Volume Shadow Copy support because it kept complaining it couldn’t back up a few files that were always in use.

Somewhere along the way of making these configuration changes I seemed to mess something up. Now whenever I attempt a backup I get an error message informing me that 7 remote files are missing and to run restore. I’ve tried the database restore, delete, and recreate options multiple times but nothing seems to resolve it and I keep getting the error about the missing remote files. I’ve looked at my BackBlaze files and the ones in question do indeed appear to be missing.

Is there any way to fix this (aside from deleting the entire backup and starting over)? It seems silly that it can’t just re-upload whatever files are missing.

Anyway, here’s the log output after running the database “recreate” option:

MainOperation: Repair
ParsedResult: Success
EndTime: 5/7/2018 7:47:04 PM (1525747624)
BeginTime: 5/7/2018 6:33:01 PM (1525743181)
Duration: 01:14:02.7393369
    RemoteCalls: 13238
    BytesUploaded: 0
    BytesDownloaded: 605115009
    FilesUploaded: 0
    FilesDownloaded: 13237
    FilesDeleted: 0
    FoldersCreated: 0
    RetryAttempts: 0
    UnknownFileSize: 0
    UnknownFileCount: 0
    KnownFileCount: 0
    KnownFileSize: 0
    LastBackupDate: 1/1/0001 12:00:00 AM (-62135568000)
    BackupListCount: 0
    TotalQuotaSpace: 0
    FreeQuotaSpace: 0
    AssignedQuotaSpace: 0
    ReportedQuotaError: False
    ReportedQuotaWarning: False
    ParsedResult: Success

ParsedResult: Error
EndTime: 5/7/2018 7:47:06 PM (1525747626)
BeginTime: 5/7/2018 6:33:01 PM (1525743181)
Duration: 01:14:04.5629368
Messages: [
Rebuild database started, downloading 6 filelists,
Filelists restored, downloading 13231 index files,
Recreate completed, verifying the database consistency,
Recreate completed, and consistency checks completed, marking database as complete
Warnings: []
Errors: [
Remote file referenced as, but not found in list, registering a missing remote file,
Remote file referenced as, but not found in list, registering a missing remote file,
Remote file referenced as, but not found in list, registering a missing remote file,
Remote file referenced as, but not found in list, registering a missing remote file,
Remote file referenced as, but not found in list, registering a missing remote file,


Hi @RichardDavies, welcome to the forum!

Unfortunately, your missing files appear to be dblock files (the ones where your actual backup contents are stored). The only way they could potentially be re-uploaded would be if your local data for those files hasn’t changed since the backup was made. And even then, Duplicati doesn’t currently have a feature to go and check for that.

By “database restore” I’m assuming you mean “repair” (correct me if I’m wrong).

Normally I would expect a database recreate to resolve this issue. Hopefully @Pectojin or @kenkendk can correct me if I’m wrong, but I’m wondering if the dblock files somehow got deleted but are still referenced in the dindex files. Since the database recreate only downloaded index files (some still referencing these now-missing dblock files), the user gets stuck in a loop.

If that’s the case, then the only ways I can think of to get out of it are to try a recreate with no index files (which would require downloading the entire set of dblock files) OR to try purging the files affected by the 7 missing dblock files.

The purge process is described here (Disaster Recovery - Duplicati 2 User's Manual) but basically comes down to:

  1. Use --list-broken-files to have Duplicati list all files that are “broken” due to the missing dblocks so you can know what is about to be removed from your backups
  2. Use --purge-broken-files to have Duplicati actually remove those files from your backup thus no longer needing the (missing) dblocks
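Assuming the standard Duplicati command-line client, the two steps might look something like this (the B2 storage URL, credentials, and database path are placeholders you’d replace with your own):

```shell
# 1. List the file versions that are "broken" because they reference
#    the missing dblocks (read-only, nothing is changed yet)
Duplicati.CommandLine.exe list-broken-files "b2://my-bucket/backup?auth-id=MY_ID&auth-key=MY_KEY" --dbpath="C:\Users\me\AppData\Local\Duplicati\BACKUP.sqlite"

# 2. Actually remove those file versions from the backup, so the
#    missing dblocks are no longer referenced
Duplicati.CommandLine.exe purge-broken-files "b2://my-bucket/backup?auth-id=MY_ID&auth-key=MY_KEY" --dbpath="C:\Users\me\AppData\Local\Duplicati\BACKUP.sqlite"
```

Running the list step first lets you see exactly what will be removed before you commit to the purge.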

Note that your current backup is not useless - it can still be restored from (other than file versions affected by the 7 missing dblocks).

And while I wouldn’t recommend this as a “fix” you should be able to continue your backups by setting --no-backend-verification=true - essentially telling Duplicati not to check for any missing files when it runs (meaning your existing bad versions are still bad and if any more dblock files go missing you won’t know about it).

If this flag is set, the local database is not compared to the remote filelist on startup. The intended usage for this option is to work correctly in cases where the filelisting is broken or unavailable.
Default value: “false”

Thank you for the reply, and yes, I meant to say “database repair”.

If that’s the case, then the only ways I can think of to get out of it are to try a recreate with no index files (which would require downloading the entire set of dblock files)

Can you elaborate on how I would go about doing that? I only know about the “Recreate (delete and repair)” button… I’m not sure how to do it “with no index files” or how to download the dblock files.

I tried the purge process you described, but that doesn’t seem to have done anything. Here’s the output of my list-broken-files command:

No broken filesets found in database, checking for missing remote files
Listing remote folder …
Marked 7 remote files for deletion
No broken filesets found

And here’s the output of the purge-broken-files command:

No broken filesets found in database, checking for missing remote files
Listing remote folder …
Marked 7 remote files for deletion

Found no broken filesets, but 0 missing remote files

I tried running the backup after that and got the same error about 7 missing remote files. And I also tried another repair without any success.

I just tried setting --no-backend-verification=true and was able to get it to successfully complete a backup again. After the backup completed, I turned off that setting and was still able to complete another backup successfully, so I think I’ve finally been able to work around this issue. Thank you so much for your help!

P.S. On a side note, I still think it’s somewhat ridiculous that the software made it so difficult to work around this problem. Why would you want to prevent future backups from running because of some missing data in the destination? Ideally, it should have just notified me of the missing remote files, but automatically gone ahead and allowed subsequent backups to run (which would re-upload any missing data that still existed on the source PC.)

I’m glad you were able to get things working again!

It does this because it doesn’t re-upload the missing data. The data missing from the destination could have been from an earlier version of a file so it may no longer be available to be re-uploaded.

Since each version of a file potentially relies on blocks backed up in a previous version, it’s possible that a future backup version may not be restorable if an older version is damaged in some way.
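The version dependency described above can be illustrated with a toy content-addressed block store (this is just a sketch of the general deduplication idea, not Duplicati’s actual implementation):

```python
import hashlib

BLOCK_SIZE = 4  # tiny blocks for illustration; real tools use e.g. 100KB

store = {}  # hash -> block contents; stands in for the remote destination


def backup(data: bytes) -> list[str]:
    """Store only blocks not already at the destination; return the
    list of block hashes that makes up this 'version'."""
    hashes = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        h = hashlib.sha256(block).hexdigest()
        store.setdefault(h, block)  # dedup: existing blocks are reused
        hashes.append(h)
    return hashes


def restore(hashes: list[str]) -> bytes:
    return b"".join(store[h] for h in hashes)


v1 = backup(b"AAAABBBBCCCC")
v2 = backup(b"AAAABBBBDDDD")  # only the new "DDDD" block gets uploaded

# If a block from v1 goes missing at the destination...
del store[hashlib.sha256(b"AAAA").hexdigest()]

# ...restoring v2 also fails, because v2 reuses that older block.
try:
    restore(v2)
except KeyError:
    print("v2 unrestorable: shared block missing")
```

This is why a missing dblock from an old backup can break the restore of a much newer version, even though the newer backup itself uploaded cleanly.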

While technically it’s likely possible to have Duplicati recognize older version blocks that are “bad” and flag a version as suspect (in terms of restore) thus allowing future backups to re-upload their “base” blocks, it doesn’t help resolve whatever underlying issue caused the bad blocks in the first place.

So Duplicati takes the stance of saying “if your destination is having problems with files going corrupt / missing then you should resolve that problem before running future backups to a suspect destination”.

Granted, it does make this sort of scenario scarier and harder to deal with than it “needs” to be, but it also helps make sure your backups really are as safe as we can make them.

I’m not completely following what you’re saying. I understand that if the destination is missing data and the corresponding file no longer exists on the source, it’s impossible to re-upload that data. There’s not much that can be done about that situation, so you just have to cut your losses and accept that the data is gone.

But I guess what I have a problem with is that Duplicati was no longer backing up any data from the source due to a small portion of the destination data being missing. Duplicati should be able to identify what part of the data is missing and inform the user about the situation (so they can look into it), but it should still be able to back up whatever data does still exist on the source (even if that means uploading it as new/base blocks because it doesn’t yet exist on the destination, or because the destination data for that file is missing…)

Yes, you’re right that doesn’t help resolve whatever underlying issue caused the bad blocks in the first place, but not backing up any new/changed data from the source leaves all that data at risk when there’s no reason why it can’t be backed up. You’re basically saying you’re not going to back up file X because you can’t restore file Y, which doesn’t make logical sense to me.

Anyway, that’s how I see it as someone who only understands backups from a conceptual level and has no in-depth knowledge of how Duplicati actually works under the hood. But hopefully this issue is behind me now.

What a great way of explaining the situation!

As far as I understand it, the main developer of Duplicati (@kenkendk) decided to do it that way on purpose. My GUESS is that it’s because there hasn’t been time to implement the necessary steps to do as you suggest, and “failing” the backup is better than silently eating the error.

Here’s my version of what might be doable when time / resources permit:

  • identify all files affected by the bad blocks
  • stop backing up those files and/or start a new “base version” for them
  • warn the user that there was a problem with CERTAIN files (or versions if a new “base version” was started) and that as far as Duplicati knows it might spread to others
  • keep warning the user until the issue is fixed or those versions are expunged either manually or by retention rules
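The first two steps in that list might be sketched like this (all of the data structures and names here are hypothetical; the real local database is SQLite with a very different schema):

```python
# Hypothetical index: which block hashes each backed-up file depends on.
file_blocks = {
    "photos/cat.jpg":  {"h1", "h2"},
    "docs/report.txt": {"h3"},
    "video/trip.mp4":  {"h2", "h4"},
}

# Block hashes known to be lost at the destination.
missing_blocks = {"h2"}

# Step 1: identify all files affected by the bad blocks.
affected = {
    name
    for name, blocks in file_blocks.items()
    if blocks & missing_blocks  # any overlap with the missing set
}

# Steps 2-3: flag those files and warn the user, while the rest of
# the backup could keep running normally.
for name in sorted(affected):
    print(f"WARNING: {name} may not be restorable (missing blocks)")

print(sorted(affected))  # → ['photos/cat.jpg', 'video/trip.mp4']
```

The hard part isn’t the set intersection, of course — it’s deciding how “base versions” for the affected files should interact with retention rules and with verification of the remaining versions.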

My motivation is that the backup must run without errors.

If errors occur, these should be fixed at the source instead of working around them. If we apply workarounds, that tends to mask the underlying cause, potentially blowing up at a later point with no clue as to why.

I implemented the purge operation to handle cases with problems, in that you can “rewrite” the backups without problematic files.

I see your point. But continuing to back up data without figuring out why the destination data is missing, seems like a fake service. You could potentially keep uploading for years, only to discover later that no data was really stored at the destination.

Since Duplicati does not know why the files are missing, it assumes there is a problem with the storage and halts. If we ignore the errors, we would potentially re-upload the same data many times over, and not know if it is stored or not.

I prefer to think of it as “We will not back up file X, because we do not know if it will work, since we can see that it failed for file Y”.


To clarify, I’m not suggesting that you ignore the error. By all means, you should alert the user to the problem so that they can investigate it and attempt to resolve it if possible.

Except that analogy is too over-simplified. What actually happened was it successfully backed up thousands of files (~600 GB) but encountered a remote error with only a handful of those files (for whatever reason).

My main problem was Duplicati made it way too hard to resolve this issue. I was basically ready to delete my backup and start over because it was no longer backing up any data and I couldn’t find a way to fix it, even though I’m a highly technical person. There’s no way I would have ever figured out how to resolve this issue without your help. And from my perspective, a backup solution that’s not backing up any new/changed data is almost useless.

On the other hand, Duplicati could have made my life so much easier by simply alerting me to the issue and then offering to resolve it by re-uploading the affected files since they still exist on my source computer.

Yes, if this issue continued to persist again and again, I would start to question the reliability of my remote backup destination and start looking for an alternative storage location, but I think this was just an isolated fluke and I would have liked an easier way to resolve this issue.

But fortunately, things are working smoothly now and I appreciate your help. And thank you for listening to my feedback.