How to speed up file recovery?

While travelling for work, I needed to recover a few files I did not have ready access to. It was 12 files at around 900 KB, stored in a backup hosted on Dropbox, and Duplicati started recreating the database. At first I thought it was stuck, but then I realized that the progress bar was just moving absurdly slowly. After about two hours, it had only progressed about 3-4%.

I am assuming that because the backup was encrypted, Duplicati has to decrypt the entire thing (~90 GB) before accessing anything? Perhaps Duplicati is only intended to recover entire swathes at a time; if so, it seems impractical to offer a granular option if it’s going to take basically the same amount of time. Is there any way to speed this up? Which options can I tweak? Currently it’s set to back up about 90 GB, updating every hour, and it presumably does its job well. I’ve never tried to recover anything until today.

Hi @bouchacha, welcome to the forum!

Sorry to hear you’re having issues with restore performance. Duplicati can restore individual files in a timely manner, but it relies on certain things to be able to do so; otherwise it falls back to slower methods.

What version of Duplicati are you using on what OS?

What is your dblock (Upload volume) size set at?

Are you restoring through the job (which should use the job database) or directly from the backup (which builds a temporary database)?

Thanks for the welcome! I am using version 2.0.3.3_beta_2018-04-02 on Windows 10 64-bit. As for dblock, I don’t know, because I am not at my home computer; I definitely did not change any of the default settings. As for your last question, no idea. I am clicking Restore on the left-hand side of the web interface and then selecting the specific file I wish to restore. It then says “Building partial temporary database …”. I am not aware of any other method of restoring.

After you click Restore on the left-hand side of the web interface, do you get several restore options? If so, which one do you select? The “Restoring files from a backup” page of the manual has a picture of them. The “Direct restore from backup files” option builds the partial temporary database, e.g. for a replaced hard drive that has lost the regular local database. For other cases, restoring through the job (either starting at Restore and then finding the job, or vice versa) should run faster.

Or is your situation that you didn’t have your backed-up computer, so you borrowed one to grab the files? Duplicati currently works best if recovery is done from the computer that took the backup. Travel sometimes interferes.

That’s correct, I do not have my regular computer, so I’m using another. I therefore did not even see the backup configuration file compiled by Duplicati. I instead directed Duplicati to look in my Dropbox App folder (where I backed up my regular computer) and fish out a specific file from the backup archives. It was able to connect and read the backup with no issue; the only problem was the very slow recovery speed for just 900 KB worth of data.

OK - that explains the slow recovery.

Using the defaults you’d have 50 MB dblock (data) files and, I think, 10 KB blocks (file chunks), so your 900 KB file has been split into 90 x 10 KB chunks.

If you’re lucky, all 90 of them are in a single 50 MB dblock file, so only 50 MB will need to be downloaded and processed. If you’re unlucky, they could be split across 90 different dblock files, meaning you’d have to download 4,500 MB (90 x 50 MB) to get to your 900 KB.

Most likely, if the file hasn’t changed much, you’re only looking at 1 or 2 dblocks (so 50-100 MB) that need to be downloaded.
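To put rough numbers on that, here’s a quick Python sketch of the best and worst case. It just restates the arithmetic above; the 10 KB block size and 50 MB dblock size are the defaults assumed in this thread, so plug in your job’s actual --blocksize and --dblock-size values if they were changed.

```python
import math

# Back-of-the-envelope download estimate for restoring one file from a
# Duplicati backup, given the assumed 10 KB block / 50 MB dblock defaults.

def restore_download_range(file_size_kb, block_kb=10, dblock_mb=50):
    """Return (chunk count, best-case MB, worst-case MB) for one file."""
    chunks = math.ceil(file_size_kb / block_kb)  # blocks the file was split into
    best_mb = dblock_mb                          # all blocks sit in one dblock volume
    worst_mb = chunks * dblock_mb                # every block sits in a different volume
    return chunks, best_mb, worst_mb

chunks, best, worst = restore_download_range(900)
print(f"{chunks} chunks: best case {best} MB, worst case {worst:,} MB")
# -> 90 chunks: best case 50 MB, worst case 4,500 MB
```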

Unfortunately, for Duplicati to figure out WHICH dblocks need to be downloaded, it has to recreate part of what would normally be read from the local database it usually has. That partial rebuild step is fairly slow.
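To give a sense of what that partial rebuild is for, here’s an illustrative Python sketch of the idea (not Duplicati’s actual code or data structures): before any dblock volume can be fetched, the restore needs a map from each block of the wanted file to the volume that holds it. That map normally lives in the local job database; without it, the remote index information has to be downloaded and processed first, which is the “Building partial temporary database …” step you’re watching.

```python
# Illustrative sketch only - not Duplicati's real internals. It shows why an
# index pass has to happen before any file data can be downloaded.

def build_block_index(index_entries):
    """Map block hash -> dblock volume name, from downloaded index data."""
    return {block_hash: volume for volume, block_hash in index_entries}

def volumes_needed(wanted_block_hashes, block_index):
    """The minimal set of dblock volumes that must be downloaded."""
    return {block_index[h] for h in wanted_block_hashes}

# Hypothetical example: three blocks known to the backup, two needed for the file.
index = build_block_index([("dblock-001", "h1"), ("dblock-001", "h2"), ("dblock-007", "h3")])
print(volumes_needed(["h1", "h3"], index))  # -> the two volumes holding h1 and h3
```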

We’re working on improvements in that area, but they’re not ready for use yet so unfortunately I don’t know that there’s any way to speed it up in your specific scenario.

If you’re looking to use Duplicati for this sort of thing in the future, then your best bet is likely to split your backup into smaller jobs, so the rebuild for any one of them shouldn’t take as long (see the rough figures below). Sorry I don’t have anything better to offer at this time. :frowning:
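For a rough sense of why smaller jobs help (assuming the default 50 MB dblock size; actual rebuild time depends on more than volume count): the partial rebuild has to work through index data for the whole job, and the amount of that data scales with the job’s total size rather than with the size of the file you want back.

```python
# Approximate dblock volume count per job at the default 50 MB dblock size.
# The index data a partial rebuild walks grows roughly with this count.

def dblock_volume_count(backup_gb, dblock_mb=50):
    return int(backup_gb * 1024 / dblock_mb)

print(dblock_volume_count(90))  # -> 1843 volumes for a single 90 GB job
print(dblock_volume_count(30))  # -> 614 volumes per job if split into three 30 GB jobs
```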


I appreciate the very thorough explanation, JonMike! I also appreciate the (free) work you’ve put into what is otherwise an excellent tool (albeit a bit obtuse at times). I’m going to try to split up my backups, as that seems like a good idea anyway.
