Missing file repair

I have run both the UI repair and the command-line repair commands, with no luck getting the backup working again from the database, which now references backup files that are missing from the backup store. I basically deleted some of the backup files by accident, and since it had taken me a while to upload them to the cloud, I didn’t want to redo the whole process. In the end I deleted everything and am now starting from scratch, with the excuse of testing other configurations this time.

The UI is pretty undocumented and unfriendly in this regard. Looking in the rear-view mirror, is there something I could have done to get this back on track without so much effort?

I just want to test how reliable Duplicati is and make sure it works when needed. And perhaps plan for other external measures I will need to take in order to protect the information.

Regards,
Rodrigo

Hi @Rodrigo_Contreras, welcome to the forum!

I’m not sure I understand what you were trying to do there. Can you explain what you mean?

If it was dblock files that were deleted, then there’s nothing you can do to recover the backed-up data they contained - it’s gone. You’ll likely get errors about files missing from the backend when trying to run backups, at which point you can either ignore those errors (not recommended) or purge the references to the missing files from the database (recommended).
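
If you go the purge route, the command-line version would look roughly like this - just a sketch with made-up paths, so substitute your own storage URL, --dbpath, and --passphrase, and check `Duplicati.CommandLine.exe help purge-broken-files` for the exact options on your version (I’m also assuming a Windows install and a local-folder destination here):

```
REM Preview what would be purged (if I recall correctly, --dry-run only reports and changes nothing)
Duplicati.CommandLine.exe purge-broken-files "file://D:\DuplicatiBackups" --dbpath="C:\Users\you\Duplicati\JOB.sqlite" --passphrase="your-passphrase" --dry-run

REM Run it for real once the preview looks sane
Duplicati.CommandLine.exe purge-broken-files "file://D:\DuplicatiBackups" --dbpath="C:\Users\you\Duplicati\JOB.sqlite" --passphrase="your-passphrase"
```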

If it was dindex or dlist files that were deleted, then I think a database repair should be able to regenerate them.
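
Again just a sketch with made-up paths, assuming the same Windows setup as above:

```
REM Ask Duplicati to recreate missing dindex/dlist files from the local database
Duplicati.CommandLine.exe repair "file://D:\DuplicatiBackups" --dbpath="C:\Users\you\Duplicati\JOB.sqlite" --passphrase="your-passphrase"
```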

Great idea for any backup solution! In the case of Duplicati, since it’s designed around NOT fully trusting the backend, it is fairly robust and can usually recover from most issues. You can even restore from a backup where SOME dblock files have been deleted, though the missing parts of the restored files will either be saved as zeroes or filled in from local files that have matching hashes.

The web-based UI is relatively new and is a huge improvement over the version 1 command-line-only usage; however, it certainly could be fleshed out and made easier for new users to, um, use.

As an Open Source project, feel free to pop on over to GitHub and submit suggestions (or even code updates) on what you think might work better for new users. (As developers it can be difficult to imagine what it’s like for a new user faced with some of the settings, so fresh eyes can help with that.)

The purging I mentioned above is probably what you could have done to avoid starting from scratch. If you want to test robustness again, I’d suggest moving the files to a different folder rather than deleting them… :wink:

Hi Jon

I still have the original source files intact, so no damage there. I was mainly wondering how Duplicati handles the mistake of deleting dblock files and whether it can redo the work where data is missing. I saw from other posts that purging sometimes takes days, and in my case it sounded best to delete everything and run the whole backup again.

As a separate note, for some reason backing up the whole lot of 500 GB is taking days now. I would assume backing up from an external USB drive to another external USB drive makes things really slow. However, the first time I used Duplicati, I gradually increased the number of files being backed up, which made things look nicer (less dramatic). I now understand that the hardware setup influences the whole process. Is that your opinion as well?

If it’s an initial backup, then that makes sense. If everything has already been processed at least once, then there’s likely something else going on.

Yes, that’s correct. How much of an influence it has depends on your job settings (compression level, encryption, etc.).
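
Just to illustrate which settings I mean (made-up paths and example values, not recommendations), this is where they show up in a command-line backup - the compression level and the upload volume size are the usual throughput knobs:

```
REM Example only - adjust paths, passphrase, and values to your own job
Duplicati.CommandLine.exe backup "file://E:\DuplicatiBackups" "D:\Data" ^
  --dbpath="C:\Users\you\Duplicati\JOB.sqlite" ^
  --passphrase="your-passphrase" ^
  --dblock-size=50mb ^
  --zip-compression-level=1
```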

I had a similar situation - I fixed it by running list-broken-files and then purge-broken-files, and it resumed.
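
In case it helps, what I ran looked roughly like this (typed from memory, with placeholder paths and AuthID, so double-check against the help output for your version):

```
REM Show which source files reference the missing volumes
Duplicati.CommandLine.exe list-broken-files "googledrive://Backups/MyJob?authid=XXXX" --dbpath="C:\Users\me\Duplicati\MYJOB.sqlite" --passphrase="my-passphrase"

REM Drop the references to those files; after this the normal backup run succeeded again
Duplicati.CommandLine.exe purge-broken-files "googledrive://Backups/MyJob?authid=XXXX" --dbpath="C:\Users\me\Duplicati\MYJOB.sqlite" --passphrase="my-passphrase"
```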

For reference, this is a large static backup set that gave me “duplicate files exist at backup” after uploading for a week. I verified that two files existed with the same name in Google Drive, with what looked like the same path (the Google Drive interface seems to hide information at times), so I naively removed them both (and emptied the trash, as Duplicati IIRC still finds them there on Google Drive). Deleting/repairing/recreating the database did not fix it, despite the actual backup files being unchanged, but purge-broken-files followed by running the backup did fix it. PSA: obviously rename instead of deleting.

Good to hear you resolved your issue the “right” way.

And, yes. Depending on which files are missing from the destination, a database repair may not resolve the issue, but purge-broken-files should.

When fiddling with destination data files, moving them to another folder (even a subfolder) is probably the best way to go. Unless you rename the file so it doesn’t START with the job prefix (default duplicati-), you could end up causing an error about unexpected files being found.
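
For example (made-up file name, and assuming a local-folder destination on Windows):

```
REM Safe: stash the volume in a subfolder of the destination
mkdir D:\DuplicatiBackups\stash
move D:\DuplicatiBackups\duplicati-b1a2b3c4d5.dblock.zip.aes D:\DuplicatiBackups\stash\

REM Also fine: rename in place, but drop the "duplicati-" prefix so the leftover
REM file isn't reported as an unexpected extra file at the destination
ren D:\DuplicatiBackups\duplicati-b1a2b3c4d5.dblock.zip.aes test-b1a2b3c4d5.dblock.zip.aes
```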