Problems restoring with Duplicati on Windows 10

Hi, I’m new to Duplicati and am having problems. I have just installed Duplicati on Windows 10 Home x64 and am backing up some sample files to an Ubuntu PC on the local network using the SFTP protocol. The backups go fine, and I can see the backed-up files when I go to perform a restore. The files get restored properly, but the progress bar never fills up; it just sits there saying “Downloading files …”

Any suggestions?

Patience? I experienced this myself with an SFTP restore just today. The files I wanted restored showed up very quickly even though the restore status bar showed “fetching blocks” or “downloading files” for quite a while afterwards.

Eventually the restore “finished” and I got a donation request message with a “Done” button.

I haven’t tested yet but my GUESS is Duplicati may be pre-allocating space to make sure there’s room for the full restore so it may only SEEM like the restore is done just because the files (with sizes) seem there.

Hm… interesting advice. The file I was restoring was a 94KB document, and I gave it a good 5 minutes to restore on a gigabit LAN.

How long should I expect to have to wait? Is it this way when using protocols other than SFTP?

I’m not sure.

I re-did my test on an SFTP backup (296MB backup from a 728MB source with subfolders restored to a custom location drive root with “Restore read/write permissions” enabled) and what I experienced was:

  • “Starting …” (4 seconds)
  • “Scanning existing files …” (11 seconds)
  • “Scanning for local blocks …” (15 seconds - files appear to have already been restored??? Is it just copying the locally found source because it hasn’t changed since the last backup?)
  • “Downloading files …” (350 seconds)
  • “Verifying restored files …” (18 seconds)

Total restore time ~7 min. (mostly spent downloading files).

I’ll see if I can set up another test where the source files are no longer available…

Doing a restore again, I see Duplicati is downloading what appears to be the entire backup set in order to deliver a single-file restore. I guess this makes sense, as there is no logic on the server…
This could be a major limiting factor… I have around a quarter terabyte to protect…

If restoring from a job that has a local database (the “normal” scenario) then Duplicati should be using the db to figure out and download only the archive (dblock) files needed to restore the requested file(s).

So if your --dblock-size is 50MB (default) and your --block-size is 100KB (default) and you want to restore a file that is 100KB or smaller, a single 50MB archive file would have to be downloaded.

If your dblock (archive) size is 250MB, then that same 100KB (or smaller) file restore would require a 250MB download.

What can get expensive is when you want to restore a file larger than the block size (default 100KB). If you have a 1MB file to restore then Duplicati has to fetch all 10 x 100KB blocks to reassemble the file.

If there are no revisions to the file and it’s not duplicated, it’s likely only one (maybe two) dblock archive files will have to be downloaded.

But if there are multiple revisions of the file being restored (or it shares blocks with other files), then the blocks could be “fragmented” across multiple dblock archive files, meaning more would need to be downloaded.

So in our 1MB example file with 100KB blocks stored in 50MB dblocks (archives) the actual downloads could be anywhere between 50MB (all 10 chunks in 1 dblock) and 500MB (1 chunk in each of 10 different dblocks).

Total downloads for a single-file restore (with a local database) should always be between 1 dblock archive file and (size of the file you are restoring ÷ your block size, default 100KB) dblock archive files. Multiply that count by your dblock size (default 50MB) and you get the approximate download size.

I suppose with the local database we should be able to figure out how many blocks are needed for the restore, so it might be nice if the GUI let the user know the estimated download requirements…

Thanks, this matches my thinking after my last post. I was looking through the settings and realised that I had changed the block size (I forget the specific setting). I’ve wiped my historical backups and started again.

Testing continues before rollout

Thanks for your help.

That sounds a bit much. If you have the local database, Duplicati will figure out which dblock files it needs to recreate the file. If the file is smaller than the block size (100KB default), it will need to download just a single dblock file.