Backup error - OneDrive

Boa tarde.
Tenho um backup sendo feito em nuvem. Ele foi concluído com sucesso a primeira vez, porém quando coloco para ser executado pela segunda vez, dá erro dizendo: “falha ao processar arquivo duplicati”. Assim, não é realizado o backup.
Alguém poderia me ajudar?

Good afternoon.
I have a backup being made to the cloud. It completed successfully the first time, but when I run it a second time it fails with the message: “failed to process file duplicati”. As a result, the backup is not performed.
Can anybody help me?

http://sonda.me/file/9cb0b7d59c07ca6c8a7f04d4d646a008

My suspicion is that the apparent change of remote volume size to 500MB (up from the default 50MB) means the download can’t complete in the default 100 seconds that Duplicati allows for downloads from OneDrive. Actually it’s a Microsoft default, but Duplicati doesn’t change it. You can probably see it in the Duplicati server log at About --> Show log --> Retry, towards the end of the backup when the verifying backend files step occurs. That step samples a dlist, a dindex, and a dblock for integrity, and several of your backups failed on the large dblock. Watching the live log should show retries every 110 seconds or so for about 4 tries, after which it gives up…

If this is what’s going on, add an --http-operation-timeout set to something long enough to download a remote volume.
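For example (just a sketch, not exact syntax for your setup: the destination URL, source path, and the 15-minute value are placeholders you would adjust), on the command line that might look something like:

Duplicati.CommandLine.exe backup "onedrivev2://Backups/MyPC?authid=..." "C:\Data" --http-operation-timeout=15m

In the GUI, the same option can be added under the backup job’s Options screen, in Advanced options.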

Set remote volume size and http-operation-timeout automatically based on bandwidth #3819 is a request for Duplicati to help the user set these up suitably. I have a feeling most destinations aren’t as sensitive to slow transfers as OneDrive; in fact, I suspect some don’t time out at all, which might harm the ability to notice hangs.

Setting timeouts not-too-short but not-too-long is hard. Anyway, I hope your resolution is just a new option.

Então, se eu redefinir o backup para 50MB, será que haverá uma solução em aumentar o tempo?

So, if I reset the backup to 50MB, would increasing the time be a solution?

It all depends on how fast your Internet connection can download. Any ideas? Can you run a speed test?
Certainly making it 50MB will help. You need maybe about 50 Mbits/second to do 500MB in 100 seconds.
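Roughly, as back-of-the-envelope arithmetic (ignoring protocol overhead, retries, and whether a MB is counted as 1000^2 or 1024^2 bytes):

500 MB x 8 bits/byte = 4,000 Mbits
4,000 Mbits / 100 s = 40 Mbit/s sustained

so you would want on the order of 50 Mbit/s to comfortably pull a 500MB volume inside the 100-second default, while a 50MB volume needs only about a tenth of that.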

Choosing sizes in Duplicati gives guidance on this. Sometimes larger is better; if so, raise the timeout.

The other reason to raise the timeout is that Remote Volume Size applies only as volumes are initially created, so the existing large ones would continue to time out unless you arrange for an automatic or manual compact to remake them.
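If you do switch back to 50MB and want the old 500MB volumes rewritten, a manual compact can be run from the command line; a rough sketch only (the destination URL is a placeholder, and options such as --threshold, which controls how much wasted space triggers a rewrite, are described in the manual):

Duplicati.CommandLine.exe compact "onedrivev2://Backups/MyPC?authid=..."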