I’m currently running Duplicati as a Docker container on unRAID. I’ve searched the unRAID forums pretty exhaustively and haven’t been able to get a clear answer on this.
Duplicati is slow for me, very slow. I’m backing up to a USB drive, and the actual transfer of the zip files to the drive seems fine; it’s the generation of the .zip volumes that happens at a very slow rate of about 5-10 MB/s. Playing around with some settings made a small improvement, but not by a large margin. I finally ran my initial backup with the following settings; anything not listed was left at default.
Thread Priority: high
Block size: 500KB
DBlock size: 250MB
Compression is default. I didn’t notice any difference with it on or off.
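For reference, here’s my understanding of how those settings would look as a duplicati-cli invocation (the source path, destination path, and passphrase below are placeholders, not my actual setup):

```shell
# Sketch of the equivalent command-line backup; paths and passphrase
# are placeholders, not my real configuration.
duplicati-cli backup file:///mnt/usb-backup /source/data \
  --passphrase=REDACTED \
  --thread-priority=high \
  --blocksize=500KB \
  --dblock-size=250MB
```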
The initial backup of 1.7 TB took over two days with these settings. Now that the full backup is complete and I’m just running incrementals, the duration is a more bearable 15-30 minutes a night, maybe 45 minutes if there have been 5-10 GB of additions/changes. I can live with this. However, there’s a new issue that has me worried.
Browsing through the backed-up files is painfully slow. Every attempt to expand a folder takes two to three minutes, and just test-restoring a single file can take me up to 15 minutes of drilling down through folders to reach it. It seems to take longer with every new backup. I’m afraid it will eventually become impossible to browse them, and I’ll be left with a backup I can’t restore from.
Is any of this normal? If not, is there anything I could be doing wrong that’s causing this?