Show Unfiltered Log in Duplicati interface

A lot of mine are just test jobs, some intended to fail, so mine is an odd setup. More typical reasons for multiple jobs are redundant backups, or splitting the source into several jobs rather than one huge backup. Very large backups get slow at default settings.
Increasing blocksize helps a lot; otherwise the default 100 KiB blocks create too much tracking overhead.
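
For example, a minimal sketch of raising it on a new job (the backend URL and source path here are placeholders; the same `blocksize` option is available under the job's advanced options in the UI):

```
Duplicati.CommandLine.exe backup "b2://example-bucket/backup" "C:\Data" --blocksize=1MB
```

Note that blocksize has to be chosen before the first backup runs; it can't be changed on an existing backup.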

Can you provide any error messages? A forum search for “unexpected program termination” finds no matches.
Or do you mean that the log file records the job normally, but then just stops, and the next backup runs?
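
If nothing is being logged to a file yet, setting one up would show where things stop. A sketch (the path is a placeholder for wherever you want the log):

```
--log-file=C:\Duplicati\backup-log.txt
--log-file-log-level=Information
```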

Any idea how slow? If you mean the site runs IP phones, the connection must have some capacity, yet the configuration is unclear.

1k what? This sounds like throttle-upload. Do you mean you have it set like below, and it starves the phones?

[screenshot: throttle settings in the Duplicati UI]

Throttling is buggy (in 2.0.5.1 the upload throttle throttles download too; fixed in Canary), but not that bad.
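
For reference, the option syntax looks like this (values made up for illustration; these are the same settings as in the screenshot above):

```
--throttle-upload=100KB
--throttle-download=0
```

As far as I know, 0 means unthrottled, which is the default.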

Phone quality might get a little worse with competing traffic (due to jitter), but starving their other services?
Do you have access to the systems to see what Task Manager or some other tool thinks Duplicati sends?

Pause and unpause duplicati at certain hours is a support topic asking for that. I can’t find a feature request, though I didn’t look in GitHub Issues, which also collects them. There are other scheduling-power wishes.
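
If someone wants to experiment meanwhile, here’s a rough external workaround sketch (heavily hedged: the pause/resume REST endpoints, port, and lack of auth below are my assumptions about the server the web UI talks to; verify against your install before relying on it):

```
# Assumed endpoints - confirm they exist in your Duplicati version.
# Pause at 08:00 on weekdays, resume at 18:00.
0 8  * * 1-5  curl -X POST "http://localhost:8200/api/v1/serverstate/pause"
0 18 * * 1-5  curl -X POST "http://localhost:8200/api/v1/serverstate/resume"
```

If the UI has a password set, the requests would also need the appropriate token header.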

If you mean you have a fat pipe, unlike theirs, maybe it’d be an easier place to start. What’s gone wrong?

sort of implies that jobs that aren’t large work, so I’m confused that it also sounds like nothing works.

Can you clarify?