I just installed Duplicati on Friday on 5 servers backing up to OD4B. A few issues. Running 22.214.171.124_canary_2018-04-02 on both Windows and Ubuntu (headless setup).
Duplicati takes an average of 1 hour to back up each server, even with very few changes. I have a 400GB server still on its initial backup, going on 14 hours now (32 GB RAM, gigabit Ethernet). Borg can back up in just a few minutes, with rclone taking another few minutes. Is this normal? I do notice Duplicati has much better compression, but should the backup really take this long? Even my smaller servers of 2-3GB take an hour. Is there some setting I have that's causing this? In comparison, Borg takes about 45 seconds to complete on the smaller servers. The backups are of Ubuntu and Windows Server 2016 VMs.
I'm also receiving the following errors after the backup finishes. Is this a bug, or am I missing something?
2018-04-09 01:37:46 -04 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20180408T192200Z.dlist.zip.aes,
2018-04-09 01:53:35 -04 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i8d36a5f5f18a49ad8dbb26acedfc3374.dindex.zip.aes,
2018-04-09 02:09:25 -04 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b2dad2dfb795f452288c946c884135de6.dblock.zip.aes
Love the software, hope someone can provide some tips. Thank you!
Performance expectations are a tricky thing to manage due to all the things that can vary from system to system and configuration to configuration. Are you backing up the raw VM image files, or is Duplicati running inside the VM?
For the “Failed to process file” error, I suspect this is happening during the verification step that checks a set (dlist, dindex, and dblock) of destination files at the beginning of each backup run. My guess is the backups themselves are completing fine and are just throwing this error at you when they finish.
To confirm this, can you try adding the --no-backend-verification=true advanced parameter to your backup job?
If this flag is set, the local database is not compared to the remote filelist on startup. The intended usage for this option is to work correctly in cases where the filelisting is broken or unavailable.
Default value: “false”
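Since you're on a headless setup, the same option can be passed on the command line. This is only a sketch: the storage URL and source path below are placeholders, not values from your jobs.

```shell
# Sketch only: replace the storage URL and source path with your own.
# --no-backend-verification=true skips the remote file-list check at startup.
duplicati-cli backup "<storage-url>" /srv/data \
  --no-backend-verification=true
```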
Note that this is a test and NOT a solution, as it basically TURNS OFF some of the tests that Duplicati does to make sure your backups are viable. The actual cause of the issue could be something as simple as the temp folder to which Duplicati is trying to save the files being full, or maybe Duplicati doesn't have read rights on the destination files, etc.
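A quick way to rule out those temp-folder causes is a check like the one below. The paths are assumptions: point TMP at whatever --tempdir your jobs actually use, run it as the same account Duplicati runs under.

```shell
# Check the volume holding Duplicati's temp files (default: $TMPDIR or /tmp).
TMP="${TMPDIR:-/tmp}"
df -h "$TMP"

# Verify this account can actually create and delete files there.
if touch "$TMP/duplicati-write-test" && rm "$TMP/duplicati-write-test"; then
  echo "temp dir writable"
else
  echo "temp dir NOT writable"
fi
```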
So with --no-backend-verification enabled the job runs without problem?
Then we should see if there's more detail behind the “Failed to process file duplicati-” messages. What I suspect is happening is that during the verification step, when Duplicati downloads a set of files (dblock, dlist, and dindex) to be tested, something like the following occurs:
the download is failing (so nothing to test)
the download results in a corrupted file (so test fails, but this should result in a different error)
the download works but before the test can be conducted the files are deleted (perhaps disk cleanup or antivirus is removing the files)
Did you try the --no-backend-verification parameter mentioned above?
A more targeted test would be to try --backup-test-samples=0.
Again, these are not fixes - just tests to help isolate exactly where the failure is happening.
After a backup is completed, some files are selected for verification on the remote backend. Use this option to change how many. If this value is set to 0, or the option --no-backend-verification is set, no remote files are verified.
Default value: “1”
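As a command-line sketch (again, the storage URL and source path are placeholders), skipping only the end-of-backup verification looks like this:

```shell
# Sketch only: keeps the startup file-list check, but skips the post-backup
# download-and-verify step by sampling zero file sets.
duplicati-cli backup "<storage-url>" /srv/data \
  --backup-test-samples=0
```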
Since --no-backend-verification disables checks that happen at the start AND end of the backup, while --backup-test-samples=0 disables just the end tests, you've confirmed the issue is with the end-of-backup tests.
Can you post the actual error message you’re getting when you don’t use either of those parameters?
I updated to the most recent Duplicati version, but no change, still the same errors:
2018-06-20 02:46:37 +00 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20180616T020001Z.dlist.zip.aes
2018-06-20 03:20:23 +00 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i79e03544a41e4efc8b28d039cb6e6db6.dindex.zip.aes
2018-06-20 03:54:09 +00 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-bd0adde35b66b4b4289b85ee5560782a2.dblock.zip.aes
Do you need more of the log? I couldn't export the text properly (only without CR/LF).
After the last working version, 126.96.36.199, and the problematic one (slow backups and warnings), 188.8.131.52, I tried 184.108.40.206. Result: still hundreds of thousands of warnings (I guess one per file), even after rebuilding the database.
Backups are also rather slow. A backup to Google Drive where there had been no changes at all took 40 minutes, when it normally takes (I don't remember exactly) less than 10 minutes.
Thanks. It looks like you did mention it, but only as “new version”, so knowing exactly which one helps.
With 220.127.116.11 does --backup-test-samples=0 still stop the errors and allow a backup to run?
If you end up on 18.104.22.168 again, try setting the test samples to 0. I suspect it won't help for you the way it did for twobi, since you're getting warnings with backups that complete, while I believe he's getting errors with backups that fail. So most likely you're running into a different issue.
If you do re-update, consider starting a new topic and include the text of the error messages you are getting.