In my experience I see something I’ll call a double copy, even in the least complicated setup:
Backup source: D:
Temp dir: %TEMP% (on C:)
Destination: E:
First the files are read from D:, dblocks are created in the temp dir on C:, and then they are “moved” to the destination, which across volumes means another full copy. The job gets done, no doubt, but it requires a lot of resources: even with compression and encryption disabled, heavy HDD I/O is needed.
For local storage like you’re using, it might work better to use --tempdir= to point to somewhere on E: and then --use-move-for-put to move the resulting files instead of copying them.
If you are using file-based storage, you can set the options --disable-streaming-transfer and --use-move-for-put. The first disables the internal stream handling (no upload/download progress reports); the second uses the OS “move” instead of “copy”, so Duplicati never reads or writes the file contents itself.
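A minimal sketch of how those options could be combined on the command line (the paths and job folder are placeholders; verify the exact option names against `Duplicati.CommandLine.exe help` for your version):

```
Duplicati.CommandLine.exe backup "file://E:\Backups\job1" "D:\Data" ^
  --tempdir="E:\duplicati-temp" ^
  --use-move-for-put=true ^
  --disable-streaming-transfer=true
```

With the temp dir on E:, the finished dblock already sits on the destination volume, so the final “put” becomes a cheap rename instead of a second full copy.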
kees-z
October 23, 2017, 6:26pm
Two ideas from @kenkendk that may improve performance and reduce disk usage:
Multithreaded backup engine:
Duplicati is currently (mostly) single-threaded. I am working on a version that is multithreaded. My guess is that the CPU is too slow on one core, which is why it does not upload faster (it cannot prepare data fast enough to keep the upload busy).
I have not heard of slow-downs due to old Mono, but you could try the “aFTP” backend, as it implements the transfers a bit differently.
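For anyone backing up to an FTP destination, trying the alternative backend is (as far as I can tell) just a change of URL scheme in the backup target; the host, path, and credentials below are placeholders:

```
Duplicati.CommandLine.exe backup "aftp://ftp.example.com/backups" "D:\Data" ^
  --auth-username=user --auth-password=secret
```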
Processing dblocks and temp files in memory:
Not sure, but I would rather fix it by keeping the file “in-memory”, as that is likely not to be much of a problem (say a 50-200 MB file, while memory is 4 GB+ on new laptops).
But it is a bit of a corner case: the file is temporary (and thus written in the right place), but it is also used to “communicate” internally, in that it is created and passed to another method. It would be possible to keep the file open, which would prevent a disk-cleaner tool from deleting it.
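Until something like that is implemented, a rough user-side approximation of “in-memory” temp files is pointing --tempdir at a RAM-backed location, if you have one mounted (the R: drive letter below is purely an assumed example):

```
Duplicati.CommandLine.exe backup "file://E:\Backups\job1" "D:\Data" ^
  --tempdir="R:\duplicati-temp"
```

Note this gives up the move-for-put benefit (the final write to E: is a real copy again), but it keeps block assembly off the spinning disk.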
I split this off to its own “disk usage”-specific topic, since fixes for this and the CPU usage issues aren’t likely to overlap much.