Slow Backups & Completion Error


#1

Hello,

I just installed Duplicati on Friday on 5 servers backing up to OD4B, and have run into a few issues. I'm running 2.0.3.4_canary_2018-04-02 on both Windows and Ubuntu (headless setup).

  1. Duplicati takes an average of 1 hour to back up on each server with very few changes. I have a 400GB backup still on its initial run, going on 14 hours (32 GB RAM, gigabit Ethernet). Borg can back up within just a few minutes, with rclone taking another few minutes. Is this normal? I do notice Duplicati has much better compression, but should the backup really take this long? Even my smaller servers of 2-3GB take an hour. Is there some setting I have that's causing this? In comparison, Borg takes about 45 seconds to complete on the smaller servers. The backups are of Ubuntu and Windows Server 2016 VMs.

  2. I receive the following errors after the backup finishes. Is this a bug, or am I missing something?
    Errors: [
    2018-04-09 01:37:46 -04 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20180408T192200Z.dlist.zip.aes,
    2018-04-09 01:53:35 -04 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i8d36a5f5f18a49ad8dbb26acedfc3374.dindex.zip.aes,
    2018-04-09 02:09:25 -04 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b2dad2dfb795f452288c946c884135de6.dblock.zip.aes

Love the software, hope someone can provide some tips. Thank you!


#2

Hello,
I have the same error as point 2 on my fresh installation on Windows Server 2008 (non-R2).
Any tips?


#3

Hello @Newdup and @acnb, welcome to the forum!

Performance expectations are a tricky thing to manage due to all the things that can vary from system to system and configuration to configuration. Are you backing up the raw VM image files, or is Duplicati running inside the VM?

For the “Failed to process file” error, I suspect this is happening during the verification step that checks a set (dlist, dindex, and dblock) of destination files at the beginning of each backup run. My guess is the backups themselves are running fine, and this error is just thrown at you when they finish.

To confirm this, can you both try adding the --no-backend-verification=true advanced parameter to your backup job?

--no-backend-verification
If this flag is set, the local database is not compared to the remote filelist on startup. The intended usage for this option is to work correctly in cases where the filelisting is broken or unavailable.
Default value: “false”

Note that this is a test and NOT a solution, as it basically TURNS OFF some of the tests that Duplicati does to make sure your backups are viable. The actual cause of the issue could be something as simple as the temp folder to which Duplicati is trying to save the files being full, or Duplicati not having read rights on the destination files, etc.
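For anyone on a headless setup, the same flag can be passed when running a backup from the command line. A minimal sketch - the storage URL, source path, and passphrase below are placeholders, not real job settings:

```shell
# Hypothetical example only - substitute your own storage URL, source
# path, and passphrase. duplicati-cli is the Linux wrapper for
# Duplicati.CommandLine.exe on Windows.
duplicati-cli backup "onedrivev2://backup-folder" /home/user/data \
    --passphrase="my-secret" \
    --no-backend-verification=true
```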


#4

Hi @JonMikeIV

Same problem here and I just tried --no-backend-verification. Backup process ends without any errors.

Your suggestions about the temp folder (enough space here and there) or read rights (read/write rights on both sides) don't help me figure out what the problem is. Any ideas?


#5

So with --no-backend-verification enabled the job runs without problem?

Then we should see if there’s more detail about the “Failed to process file duplicati-” errors. What I suspect is happening is that during the verification step, when Duplicati downloads a set of files (dblock, dlist, and dindex) to be tested, something like the following happens:

  • the download is failing (so nothing to test)
  • the download results in a corrupted file (so test fails, but this should result in a different error)
  • the download works but before the test can be conducted the files are deleted (perhaps disk cleanup or antivirus is removing the files)
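If it helps, here are a couple of quick shell checks for the first and third possibilities. The paths are assumptions - adjust them to where your temp folder and destination actually live:

```shell
# 1. Is there free space where Duplicati stages downloaded files?
#    (Duplicati uses the system temp directory unless --tempdir is set)
df -h "${TMPDIR:-/tmp}"

# 2. Can the user Duplicati runs as read the destination files at all?
#    Replace the path with one of your real duplicati-*.zip.aes files.
if [ -r "/path/to/backend/duplicati-example.dblock.zip.aes" ]; then
    echo "readable"
else
    echo "not readable (or missing)"
fi
```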

#6

Hi JonMikeIV,
Thanks for your answer a while ago. I just set up a fresh Duplicati installation to eliminate other dependencies or errors. The error is still the same.

Your suggestions:

the download is failing (so nothing to test)

I get no error when I try to download a file.

the download results in a corrupted file (so test fails, but this should result in a different error)

No other errors so far.

the download works but before the test can be conducted the files are deleted (perhaps disk cleanup or antivirus is removing the files)

There is no auto-cleaning nor an antivirus check on that drive (a NAS used for backup).

Any idea where I can change something?


#7

Did you try the --no-backend-verification parameter mentioned above?

A more targeted test would be to try --backup-test-samples=0.

Again, these are NOT fixes - just tests to help isolate exactly where the failure is happening.

--backup-test-samples
After a backup is completed, some files are selected for verification on the remote backend. Use this option to change how many. If this value is set to 0, or the option --no-backend-verification is set, no remote files are verified.
Default value: “1”

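As with the earlier flag, this can be tried from the command line on a headless install. A sketch with placeholder values (URL, path, and passphrase are not real settings):

```shell
# Hypothetical example - substitute your own storage URL, source path,
# and passphrase.
duplicati-cli backup "onedrivev2://backup-folder" /home/user/data \
    --passphrase="my-secret" \
    --backup-test-samples=0
```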

#8

Yes, I tried --no-backend-verification first, and the backup finished without error. I have now tried the parameter --backup-test-samples=0, and the backup also ended without error.


#9

Thanks for trying both parameters.

Since --no-backend-verification disables checks that happen at the start AND end of the backup, while --backup-test-samples=0 disables just the end tests, you’ve confirmed the issue is with the end-of-backup tests.

Can you post the actual error message you’re getting when you don’t use either of those parameters?


#10

I updated to the most recent Duplicati version, but no change - still the same error:
Errors: [
2018-06-20 02:46:37 +00 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20180616T020001Z.dlist.zip.aes,
2018-06-20 03:20:23 +00 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i79e03544a41e4efc8b28d039cb6e6db6.dindex.zip.aes,
2018-06-20 03:54:09 +00 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-bd0adde35b66b4b4289b85ee5560782a2.dblock.zip.aes
]
Do you need more of the log? I couldn’t export the text properly (only without CR/LF).


#11

After the last working version, 2.0.3.5, and the problematic 2.0.3.6 (slow backups and warnings), I tried 2.0.3.7. Result: still hundreds of thousands of warnings (I guess one per file), even after rebuilding the database.

The backup is also rather slow. A backup to Google Drive where there had been no changes at all took 40 minutes, where it normally (I don’t remember exactly) takes less than 10 minutes.

I went back to 2.0.3.5.


#12

@twobi, just to confirm - you’re still using 2.0.3.4 (not 2.0.3.5, 2.0.3.6, or 2.0.3.7 like @Tapio has used), right?


#13

No, sorry for not mentioning it: I updated to 2.0.3.7_canary_2018-06-17 before I ran my tests again.
But as I remember, I had the problem in all versions.


#14

Thanks. It looks like you did mention it, but only as a “new version”, so it’s good to know exactly which one.

With 2.0.3.7 does --backup-test-samples=0 still stop the errors and allow a backup to run?

@Tapio, if you update to 2.0.3.7 again, try setting the test samples to 0. I suspect it won’t help you the way it did for twobi, since you’re getting warnings with backups that run, while I believe he’s getting errors with backups that fail. So most likely you’re running into a different issue.

If you do re-update, consider starting a new topic and include the text of the error messages you are getting.


#15

With 2.0.3.7 does --backup-test-samples=0 still stop the errors and allow a backup to run?

The backup runs smoothly and quickly, and ends without warnings or errors. Thank you for your investigation!