Good afternoon.
I have a backup running to the cloud. It completed successfully the first time, but when I run it a second time it fails with the error “failed to process file duplicati”, and the backup is not performed.
Could anybody help me?
My suspicion is that the apparent change of remote volume size to 500MB (up from the default 50MB) means the download can’t complete within the default 100 seconds that Duplicati allows for downloads from OneDrive. (It’s actually a Microsoft default, but Duplicati doesn’t change it.) You can probably see this in the Duplicati server log at About --> Show log --> Retry toward the end of the backup, when the verifying-backend-files step runs. That step samples a dlist, a dindex, and a dblock for integrity, and several of your backups failed on the large dblock. Watching the live log should show a retry every 110 seconds or so for about 4 tries, after which it gives up…
If this is what’s going on, add an --http-operation-timeout set to something long enough to download a remote volume.
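As a rough sketch of what that might look like from the command line (the destination URL, source path, and the 15-minute value are placeholders I made up for illustration; in the GUI the same option can be added under the job’s advanced options):

```shell
# Hypothetical example: raise the HTTP operation timeout so a 500MB
# remote volume has time to finish downloading during verification.
# "onedrivev2://Backup/" and "C:\Data" are placeholder values.
Duplicati.CommandLine.exe backup "onedrivev2://Backup/" "C:\Data" \
    --http-operation-timeout=15m
```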
It all depends on how fast your Internet connection can download. Any ideas? Can you run a speed test?
Reducing it back to 50MB will certainly help. To move 500MB in 100 seconds you need maybe about 50 Mbit/s.
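As a sanity check on that figure, the arithmetic is just volume size times 8 bits per byte, divided by the timeout:

```shell
# 500 MB per volume, 8 bits per byte, 100-second timeout.
# This is the raw payload rate needed, before protocol overhead,
# which is why ~50 Mbit/s is a safer real-world estimate.
echo $(( 500 * 8 / 100 ))   # prints 40 (Mbit/s)
```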
The other catch is that Remote Volume Size applies only as volumes are initially created, so the existing large ones will keep timing out unless you arrange for an automatic or manual compact to remake them.
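A manual compact can be run from the command line with the compact subcommand (a sketch only; the destination URL, database path, and passphrase below are placeholders, and the elided passphrase must stay whatever yours is):

```shell
# Hypothetical example: force a compact so oversized remote volumes get
# rewritten at the current (smaller) remote volume size.
# URL, dbpath, and passphrase are placeholder values.
Duplicati.CommandLine.exe compact "onedrivev2://Backup/" \
    --dbpath="C:\Duplicati\backup.sqlite" \
    --passphrase="..."
```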