I recently moved my Duplicati installation from a MacBook Pro (i7-3520M CPU @ 2.90GHz, running Ubuntu 18.04 server) to an older machine I’ve repurposed as a fileserver (Core2 Duo CPU 6400 @ 2.13GHz, also Ubuntu 18.04 server). The first backup from the new installation has been running for 3 days now and has only got through about 500 gigabytes. This is much slower than before (the last backup took 5:57 in total, according to the UI). It’s sitting on "Backup_ProcessingFiles" and reports a speed of 1 byte/s.
I’m trying to figure out whether the slow speed is a hardware limitation that will come up every time, or just an issue with this first backup because of some incidental changes:
- File metadata has changed: a new owner UID on every file.
- The backup used to use two directories as its source; they’ve been merged into a single source directory.
As I understand it, this is enough that Duplicati will re-examine every file and update the index with the new location (“delete” the old path and add the new one). It isn’t uploading anything, because deduplication catches it all.
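My mental model of what the rescan is doing is roughly the sketch below. This is just my guess at block-level dedup, not Duplicati’s actual code; the function name, block size, and in-memory set are all made up for illustration:

```python
import hashlib


def scan_file(path, known_blocks, block_size=100 * 1024):
    """Hash each fixed-size block of a file; a block whose hash is
    already known is deduplicated and nothing gets uploaded for it."""
    new_blocks = 0
    with open(path, "rb") as f:
        while chunk := f.read(block_size):
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in known_blocks:
                # Only unseen blocks would need uploading.
                known_blocks.add(digest)
                new_blocks += 1
    return new_blocks
```

If that picture is right, a rescan of unchanged data is pure read-plus-hash work, and the second pass over the same file should find zero new blocks.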
I expected this process to go much faster, though. SHA sums aren’t exactly hard to calculate, and I can’t think of what else would be expensive here. What’s likely to be the bottleneck?
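To rule out raw hashing speed on the old CPU, I can run a quick in-memory benchmark like this (a rough sketch; the buffer size is arbitrary, and it measures SHA-256 only, with no disk I/O involved):

```python
import hashlib
import os
import time

# Hash a buffer of random bytes to measure raw SHA-256 throughput
# on this CPU, independent of disk or network speed.
data = os.urandom(64 * 1024 * 1024)  # 64 MiB of random data

start = time.monotonic()
h = hashlib.sha256()
h.update(data)
elapsed = time.monotonic() - start

print(f"SHA-256 throughput: {len(data) / elapsed / 1e6:.0f} MB/s")
```

Even a Core2 Duo should manage well over 100 MB/s here, which would make 500 GB of pure hashing a matter of hours, not days, so if this number comes back healthy the bottleneck is elsewhere.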
Here’s a screenshot of system monitoring tools: Ethernet, disk, CPU, and memory are all largely idle.
(edits for extra info)