I have started a very large backup (approx. 1.9 TB) which, based on initial transfer speeds, should have taken about 5 days. I probably should have broken this into chunks, but I did not. With 200 GB to go, it has slowed to transfers of about 3.4 MB/s. I am hoping that restarting my computer will speed this back up (and fix some other issues I am having), but I am worried about ruining the backup. If I stop the backup fully and restart, will Duplicati pick up where it left off? Also, for reference, what would happen if I restarted my computer mid-backup without warning? Thanks all.
Welcome to the forum @Nika
Did you increase blocksize? The default 100 KB is good for roughly 100 GB before slowdowns begin due to too many blocks to process. For your 2 TB, multiplying by 20 (so 2 MB) may be good, if that size holds.
Unfortunately this means a fresh backup, but maybe it won't slow down as much towards the end.
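For perspective, here is the block-count arithmetic behind that suggestion, as a rough sketch. The 2 TB figure, the 100 KB default, and the ×20 multiplier are from the advice above; the exact counts are just illustrative:

```python
# Rough block-count math behind the blocksize advice.
# Duplicati's default blocksize is 100 KB; more blocks means
# more hashing and database bookkeeping per backup run.

KB, MB, TB = 1024, 1024**2, 1024**4

def block_count(total_bytes, blocksize_bytes):
    """Number of blocks Duplicati must track for a backup of this size."""
    return total_bytes // blocksize_bytes

# Default 100 KB blocks on a 2 TB backup: ~21.5 million blocks.
print(block_count(2 * TB, 100 * KB))

# 2 MB blocks (default x20) on the same 2 TB: ~1 million blocks.
print(block_count(2 * TB, 2 * MB))
```

Twenty times fewer blocks to look up and record is roughly why the larger blocksize keeps big backups from bogging down near the end.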
Answering questions anyway:
Will a restart speed it back up? I doubt it, but it's possible.
As for the other issues you're having: too vague to comment on.
Will Duplicati pick up where it left off? It's supposed to, and that's especially likely if you stop it nicely using the Stop button.
That waits for work in progress to finish and records the current state to the destination.
Restarting the computer underneath it is often OK, but creates greater risk.
Interrupting the backup forces the next backup to clean up from wherever the previous one ended.
This might mean more work overall, but ideally it picks up from that point.
Especially if your backup contains multiple data types, it might be smart to split it into several separate backups. That's what I've done to radically improve performance and de-duplication when running backups from ~20 TB data volumes (with multiple data types).
In my experience, it doesn't hurt to use huge blocks if the data is already known to be unique, so the chances of getting any de-dupe hits are already in the probability range of a hash collision.