How to speed up file recovery?

OK - that explains the slow recovery.

Using defaults you’d have 50MB dblock (data) files and 100KB blocks (file chunks), so your 900KB file has been split into roughly 9 x 100KB chunks.

If you’re lucky, all 9 of them are in a single 50MB dblock file, so only 50MB will need to be downloaded and processed. If you’re unlucky, they could be split across 9 different dblock files, meaning you’d have to download 450MB (9 x 50MB) to get at your 900KB.

Most likely, if the file hasn’t changed much you’re only looking at 1 or 2 dblocks (so 50-100MB) needing to be downloaded.
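If you want to play with those numbers yourself, here’s a back-of-the-envelope sketch of the best/worst case maths. It’s just the reasoning above in code form, assuming the default 100KB blocksize and 50MB dblock size; the function name is mine, not anything from Duplicati:

```python
import math

# Rough restore-download estimate (my simplification, not Duplicati's actual
# logic): a file is split into fixed-size blocks, and each needed block lives
# inside some dblock volume that must be downloaded whole.
BLOCK_SIZE = 100 * 1024          # default --blocksize (100KB)
DBLOCK_SIZE = 50 * 1024 * 1024   # default --dblock-size (50MB)

def restore_download_range(file_size: int) -> tuple[int, int]:
    """Return (best_case, worst_case) bytes downloaded to restore one file."""
    blocks = math.ceil(file_size / BLOCK_SIZE)
    best = DBLOCK_SIZE           # every block sits in the same dblock volume
    worst = blocks * DBLOCK_SIZE # every block sits in a different dblock volume
    return best, worst

best, worst = restore_download_range(900 * 1024)  # the 900KB file above
print(f"best case:  {best / 2**20:.0f} MB")       # 50 MB
print(f"worst case: {worst / 2**20:.0f} MB")      # 450 MB
```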

Unfortunately, for Duplicati to figure out WHICH dblocks need to be downloaded, it has to recreate part of the local database it would normally read from. That partial rebuild step is fairly slow.
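For what it’s worth, a direct restore without the local database looks roughly like this from the command line (duplicati-cli on Linux/macOS, Duplicati.CommandLine.exe on Windows; the storage URL and paths here are placeholders, not real values). The --no-local-db option is what makes Duplicati work from the remote files directly, doing that partial rebuild on the fly:

```sh
# Restore a single file straight from the backend, skipping the local database.
duplicati-cli restore "<storage-url>" "path/to/yourfile" \
  --no-local-db \
  --restore-path=/tmp/restored \
  --passphrase=<your-passphrase>
```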

We’re working on improvements in that area, but they’re not ready for use yet so unfortunately I don’t know that there’s any way to speed it up in your specific scenario.

If you’re looking to use Duplicati for this sort of thing in the future, then your best bet is likely to split your backup into smaller jobs so the partial rebuild for any one of them shouldn’t take as long (there’s a sketch of the job settings below). Sorry I don’t have anything better to offer at this time. :frowning:
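If you do set up new, smaller jobs, you could also shrink the worst case from the maths above by picking a smaller --dblock-size when you create them, since each needed block costs at most one dblock download. A sketch, with placeholder URL and paths; note that --blocksize can’t be changed after the first backup runs:

```sh
# New backup job with smaller remote volumes to cap per-file restore downloads.
duplicati-cli backup "<storage-url>" /home/me/documents \
  --dblock-size=10MB \
  --blocksize=100KB \
  --passphrase=<your-passphrase>
```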
