I would like to add that, just to rule out the Microsoft OneDrive side, I ran a ~70 GB backup with a different app, Duplicacy, and it completed just fine with no errors.
I really like Duplicati, but sadly it is not working with OneDrive!
So it looks like there is some kind of Duplicati bug. Maybe the devs would like to investigate?
A Google search of that error shows how widespread and poorly understood this is across different software, including Arq Backup and rclone. rclone had a pretty bad case of it and worked around it by doing more retries.
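The rclone workaround was basically "retry more". As a generic sketch of that idea (not Duplicati's or Duplicacy's actual code — the helper name and parameters here are made up for illustration), retrying a flaky upload call with exponential backoff looks something like:

```python
import time

def with_retries(fn, attempts=5, base_delay=1.0):
    """Call fn(), retrying on any exception with exponential backoff.

    Re-raises the last exception if every attempt fails.
    """
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts, give up
            # back off 1s, 2s, 4s, ... before the next try
            time.sleep(base_delay * (2 ** i))
```

The point is that a transient server-side hiccup (which OneDrive seems prone to) succeeds on a later attempt, so a backend that retries harder can "complete just fine" where a less persistent one errors out.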
This is the first Duplicati report of it that I can find in the forum or in Duplicati's GitHub Issues. Some answers on the Internet wonder whether it's related to the specific account, or maybe even to the equipment being used.
Duplicacy's OneDrive code is possibly here, and it is possibly also more persistent than Duplicati's (whose retry count is settable).
From that code chunk, it looks like you might be able to see Duplicacy log entries about any retry activity.
Duplicati retries can be seen at Retry level (e.g. About → Show log → Live → Retry), or in the job log, which shows "RetryAttempts": 5, (for example, from my new Backblaze backup) in its Complete log section.
I'm not going to try to talk Duplicacy details, but for Duplicati, see what's retrying. Normally a backup does a list to sanity-check the destination files, then goes into lots of put operations to upload its dblock and dindex files.
Time per file varies with network speed and the Remote volume size setting (default 50 MB), which limits dblock file size. You might also see if you can get a small, very-default backup going.
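If you want to test from the command line instead of the GUI, the relevant knobs would look roughly like this (a sketch, assuming the standard Duplicati CLI option names — the backup URL, authid, and paths are placeholders; check `duplicati-cli help` on your version):

```shell
# Placeholder destination and source; --dblock-size sets the Remote volume size,
# while --number-of-retries / --retry-delay control how persistently Duplicati
# retries a failed upload. Retry-level logging goes to a file for inspection.
duplicati-cli backup \
  "onedrivev2://backup-folder?authid=YOUR_AUTHID" \
  /path/to/source \
  --dblock-size=50MB \
  --number-of-retries=10 \
  --retry-delay=10s \
  --log-file=duplicati.log \
  --log-file-log-level=Retry
```

Raising the retry count is the same workaround rclone landed on, so it's a cheap thing to try before digging deeper.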