After using Duplicati for a few weeks, one has sort of figured out a few things, but not others. Yet a troubleshooting guide seems to… not exist?
For example, what does one do when one runs into some of the following:
ERROR: “Cryptographic Exception: Invalid Header”
When does one run compact? Or Repair? Or Delete/Repair?
How does one just nuke all the remote files, but keep the job definition, and run the job from scratch? (i.e. start over?)
And so on and so forth. Is there a “noobs read this first” document?
Hi @Kasey_Chang, welcome to the forum 
True, we do not have a case-by-case guide. You can usually search the forum and find the answer.
This indicates that the file was expected to be an encrypted file, but was not. I need a little more context to say more.
Generally you should not run these.
Compact runs automatically when the backup has too much wasted space, and unless you have disabled automatic compacting, there is usually no gain from running it manually.
The repair should only be run if you have problems where the database and the remote storage are out of sync. This should generally not happen, and is usually an indication of either faulty storage or a bug in Duplicati.
The delete/repair is a “I give up, assume the remote storage is correct, and go from there”-type action.
That is not a feature (because it is dangerous, and mostly intended for testing things).
What you can do, is export the job configuration to a JSON file.
Then delete the backup and choose to also delete remote files.
Then import the job configuration again.
Not really; there is the documentation that explains how to set it up:
But that is quite high-level compared to the things you are asking.
But do keep asking, then we have a thread here with answers 
Okay, right now, the backup has finished, with 27 warnings and 68 errors:
The warnings are of 2 types:
a) Missing remote hash – one of the files is listed as verified with a size, but it is the wrong size
b) FileAccess Error – this one I can deal with… I exclude this subdir from the backup
The Errors, however, I have no idea how to get rid of:
FailedToProcessFile: Failed to process file duplicati-iXXX.dindex.zip.aes CryptographicException: Invalid header value: 00-00-00-00-00
That one is probably related to the cryptographic error.
It means that Duplicati has recorded the file size and hash prior to storing the file on the destination.
When checking what is on the destination, the size (and/or hash) has changed, indicating that either the file was tampered with or the transfer failed in some way that was not detected by Duplicati.
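Conceptually, the check works along these lines (a rough Python sketch for illustration only, not Duplicati's actual code; the hash algorithm and the expected values are stand-ins for whatever is recorded in the local database):

```python
import hashlib
from pathlib import Path

def verify_remote_file(path, expected_size, expected_hash):
    """Compare a destination file against the size and hash recorded at upload time."""
    data = Path(path).read_bytes()
    if len(data) != expected_size:
        return f"size mismatch: expected {expected_size}, found {len(data)}"
    if hashlib.sha256(data).hexdigest() != expected_hash:
        return "hash mismatch: the content changed after upload"
    return "ok"
```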
This one means that the file is not an AES encrypted file. The first 5 bytes are all zero, but they should have the AES header. Can you examine the files manually and see if they are all zeroes? Are there any clues as to why a file would be stored with zeroes in the header?
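A quick way to look at them is something like this (a minimal Python sketch; the "AES" magic bytes come from the AES Crypt file format used for the encrypted volumes, and the file name below is just the placeholder from your error message):

```python
def inspect_header(path, count=16):
    """Print the first bytes of a file and guess whether the header looks valid."""
    with open(path, "rb") as fh:
        head = fh.read(count)
    print(path, head.hex(" "))
    if head.startswith(b"AES"):
        print("  starts with the AES Crypt magic bytes - looks like a normal encrypted file")
    elif head and not any(head):
        print("  all zero bytes - the file content was never written correctly")
    else:
        print("  unrecognized header")

inspect_header("duplicati-iXXX.dindex.zip.aes")  # placeholder name from the error message
```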
I have not heard of Duplicati ever creating a file like that, so I would guess it is something else that is the problem. Can you reveal what protocol or provider you are using for the storage?
Just a regular internal hard drive. I am guessing the southbridge may have overheated and lost the connection to the hard drive during the long backup session. Some of my prior backup sessions did result in I/O errors and an unresponsive drive. That's my hardware problem, and it seems to be fixed, at least for the moment (I turned all fans to 100%).
But the question is what do I do to get rid of the errors? Or do I have to go in and delete each bad file manually?
Yes, you need to delete the files manually. There is no functionality in Duplicati that will delete files that are not looking correct.
Maybe there should be? Hide it under that confirmation prompt like when you purge the backup set. I am probably using the wrong terminology.
Anyway, this is what I did to get rid of the 68 errors.
- Go to the log and list the files with the cryptographic errors (it only lists 20 at a time)
- Go to the fileset and delete each one, using search. The first 5 characters (not counting "Duplicati-") should be enough to find that specific file (a script to find all the bad files at once is sketched after this list)
- When you delete all 20 (or whatever's left), do Database → Repair (it should only take a second)
- Do ANOTHER backup. This time it should go quite a bit faster, as Duplicati doesn't have to retry each file 5 times during the "verify files" phase
- Your errors should decrease by 20. (68 → 48)
- Repeat the process until all errors are eliminated (68 → 48 → 28 → 8 → 0)
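If the destination is a plain local folder, the "find every bad file" part can be scripted instead of paging through the log 20 at a time. A minimal Python sketch, assuming the destination path below is replaced with the real one; it moves the suspect files to a side folder rather than deleting them so they can still be examined, and the database repair and fresh backup steps above are still needed afterwards:

```python
from pathlib import Path

# Placeholder - point this at the real backup destination folder.
dest = Path("/path/to/backup/destination")
quarantine = dest / "bad-files"
quarantine.mkdir(exist_ok=True)

for f in sorted(dest.glob("duplicati-*.aes")):
    with f.open("rb") as fh:
        head = fh.read(5)
    # The broken files in this thread start with five zero bytes
    # instead of the AES Crypt header.
    if head == b"\x00" * 5:
        print("quarantining", f.name)
        f.rename(quarantine / f.name)
```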
Instead of taking 1 hour 15 min, backup now takes less than 15 minutes.
Thanks for reporting the method.
I have registered an issue for getting an automated version of this.