Failed: Invalid header marker

I made the 100MB files


Had the same issue again.
Restarting Duplicati fixed it in my case.

I migrated my installation to the canary Docker container because I did not trust the QNAP mono implementation.

But still the same :frowning: My 1.7TB backup is throwing the same error over and over again, and sometimes a restart of the container fixes it, sometimes not.
Fun fact: the second job, with 25GB, always runs through without an error.

Both have the same destination and use the same transfer protocol (SSH), and the other settings are similar; only the size differs.

This is starting to bug me.
Running Duplicati - 2.0.4.17_canary_2019-04-11

System.Security.Cryptography.CryptographicException: Invalid header marker ---> System.IO.InvalidDataException: Invalid header marker
  at SharpAESCrypt.SharpAESCrypt.ReadEncryptionHeader (System.String password, System.Boolean skipFileSizeCheck) [0x00054] in <5e494c161b2e4b968957d9a34bd81877>:0 
  at SharpAESCrypt.SharpAESCrypt..ctor (System.String password, System.IO.Stream stream, SharpAESCrypt.OperationMode mode, System.Boolean skipFileSizeCheck) [0x001af] in <5e494c161b2e4b968957d9a34bd81877>:0 
  at (wrapper remoting-invoke-with-check) SharpAESCrypt.SharpAESCrypt..ctor(string,System.IO.Stream,SharpAESCrypt.OperationMode,bool)
  at Duplicati.Library.Encryption.AESEncryption.Decrypt (System.IO.Stream input) [0x00000] in <8a774389440c4ca192f23b405ad0f041>:0 
  at Duplicati.Library.Encryption.EncryptionBase.Decrypt (System.IO.Stream input, System.IO.Stream output) [0x00000] in <8a774389440c4ca192f23b405ad0f041>:0 
  at Duplicati.Library.Main.BackendManager+<>c__DisplayClass36_0.<coreDoGetPiping>b__0 () [0x00029] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at System.Threading.Tasks.Task.InnerInvoke () [0x0000f] in <3833a6edf2074b959d3dab898627f0ac>:0 
  at System.Threading.Tasks.Task.Execute () [0x00000] in <3833a6edf2074b959d3dab898627f0ac>:0 
   --- End of inner exception stack trace ---
  at Duplicati.Library.Main.AsyncDownloader+AsyncDownloaderEnumerator+AsyncDownloadedFile.get_TempFile () [0x00008] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at Duplicati.Library.Main.Operation.CompactHandler.DoCompact (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Boolean hasVerifiedBackend, System.Data.IDbTransaction& transaction, Duplicati.Library.Main.BackendManager sharedBackend) [0x0026c] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00399] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x01028] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <6973ce2780de4b28aaa2c5ffc59993b1>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x00008] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at Duplicati.Library.Main.Controller+<>c__DisplayClass13_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x00035] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x00271] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <e737745a39a143f09a82fd4f2eaa262c>:0 
  at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x002f7] in <c71fcbcde8624d2e8a8aa9e50b627014>:0 

I did some testing over the last few days, and as soon as I set the backup retention to “keep all backups” the error is gone. Before that I had “smart backup retention”.

What does this mean? Is my backup now broken? To me it seems like Duplicati is unable to open older backup files in order to “delete” them based on the former retention setting.

This happened again today with another backup set I have, but luckily the next backup did not raise the same error again.
I am not sure what triggers it, but I am slowly starting to doubt the reliability of my backups. Any ideas, or information I can provide, to help fix this?

Although it was asked lightly earlier, let me ask if anybody is using any sort of speed throttle on upload or download, e.g. from --throttle-upload, --throttle-download, or the GUI throttle button at the top of the page?

Upload throttle corrupts backup, especially OneDrive. Analyzed, with code proposed. #3787
(the OneDrive comment just means it was the worst of the several tested – there could easily be more)

Determining the nature of the corruption would help, but may involve looking in SQLite databases (not so difficult, but probably needs DB Browser for SQLite or similar installed). Some simpler tests are to check the size of the corrupted file, convert it to hexadecimal, and see if it looks too round; that might be a filesystem error.

--upload-verification-file will make a text file duplicati-verification.json that describes the expected remote files. Running DuplicatiVerify.* from the Duplicati utility-scripts folder can then test that destination files look as expected, assuming you can either get that sort of destination access, or are willing to make the files more accessible. However, if a complete download is required anyway, you might just as well run the test command over all files. That should do a hash check of all remote files, but you can also do your own simple manual check of some particular file by looking it up in duplicati-verification.json, to at least see if the actual size is as expected.
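If the complete-download route is taken, a sketch of that run could look like this (the storage URL and database path are placeholders; the test command and the “all” sample count are standard Duplicati CLI usage):

 mono /opt/duplicati/Duplicati.CommandLine.exe test <storage-url> all --dbpath=<path-to-job-database.sqlite>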

Stepping back to more ordinary steps, logs can be good to see what led up to the problem; however, with problems in older files (how old is the bad file, based on remote info?), you can't get details retroactively. Sometimes one might get lucky and find relevant info in the default job logs or in About --> Show log --> Stored. Beyond that, use --log-file with --log-file-log-level=retry (a reasonable compromise, but other levels may do).
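For example (the log path is just an illustration), added as Advanced options or to the command line:

 --log-file=/data/duplicati/job.log
 --log-file-log-level=retry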

Thanks; lots of mights and coulds I can look up, and maybe find something :wink:
I do have a backup set which I have left in place but am not using any more, because it raised the above error every time I ran it.
I had the error once with another backup set too, so what I would offer is my time and my backup sets to find the error, but someone with the capability to fix the issue in the end needs to guide me.
I would say a step-by-step approach would be great; it can also be on GitHub or via mail or something like that, but I would appreciate it if this problem could be fixed with the time everyone spends. Otherwise it is a collection of data nobody is going to use anyway.

So, who is in to hopefully get this figured out with the backup sets I have? Perhaps someone else will jump into it as well.

Asking again. If the answer is “yes”, the code fix may just need to be done. If “no”, then the big chase starts. Generally there can be no promises that a chase will result immediately in a fix, but data helps in any event.

The best way to preserve data is not the forum (where findings are scattered), but GitHub issues (now at 820).

Some of the ideas were for this, not specifically for chasing the issue. Rather than doubt, one can check… Alternatively (and easier), one can add the Advanced option --backup-test-samples to raise routine verification.
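For example, to verify 5 random sample sets after each backup instead of the default 1 (the number 5 is arbitrary; higher means more downloading per run):

 --backup-test-samples=5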

That cuts it down to a fraction of the small number of active developers (busy with 820 issues), but we'll see…

Thanks, let's work on some things using the forum; afterwards I'll collect and summarize the findings on GitHub.

Concerning the throttling: I am not doing any throttling. The backup set which started to fail some time ago, and is still failing, has the following “switches” set; I just used the “export” functionality of the set to provide the information. The others are configured very similarly, and besides one error 2 days ago they work just fine.

mono /opt/duplicati/Duplicati.CommandLine.exe backup "ssh://x.x.x.:22/mnt/backup/qpkg/?auth-username=xxx&ssh-fingerprint=ssh-rsa 2048 16:43:4A:A0:B1:8C:81:76:6B:8B:FD:86:9F:AC:08:A0" /share/QPKG/ /share/Container/data/ --backup-name="Service Configs [remote]" --dbpath=/data/Duplicati/74776788828989867881.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50MB --keep-time=6M --blocksize=500KB --backup-test-samples=1 --disable-file-scanner=true --send-http-url=https://www.duplicati-monitoring.com/log/xxx --exclude-files-attributes=temporary --disable-module=console-password-input

This still happens, and I am not sure why.
Some backups run just fine; on other days the job fails with the invalid header marker error.

I have no clue how to collect more information without help from someone else :frowning:

This is an ongoing issue and it would be great if someone could just help a little.
I would even give access to the instance if that would speed things up, but over the last 7 days every backup failed because of this.
This makes me nervous; restore still works for what I have tested, but for how long?

THANKS

I gave nearly ten approaches earlier. Are none of them even remotely meaningful? I could elaborate; however, there aren't nearly enough volunteers (of all sorts) to allow on-site debugging, even if it were technically feasible, which it frequently isn't due to limited tools, the need for highly special expertise, etc.

How are you testing restore? It's unfortunately not very meaningful if restoring to the same system with the original files still around, because original files will be checked and used as restore material relevant to the requested restore; using local data blocks runs faster than downloading blocks.

--no-local-blocks, added and checkboxed in Advanced options on job screen 5 (Options), will disable optimization of this sort. Direct restore from backup files to another machine won't have this issue.

We need to find out better how badly your backup might be damaged. Ideally you would be able to direct restore all of it to a different system, which simulates recovery from disaster loss of originals.
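As a sketch, such a disaster-simulation restore could be run from the command line like this (the storage URL and restore path are placeholders; restore, --restore-path, and --no-local-blocks are standard options):

 mono /opt/duplicati/Duplicati.CommandLine.exe restore <storage-url> "*" --restore-path=/tmp/restore-test --no-local-blocks=true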

I’m not sure what constraints you have on disk space, bandwidth, or metered service that interfere. What level of access do you have to the SFTP server? Can you directly access files? Run scripts?
Do you know what SFTP server you have, and what access you have to examine the files closely?

At what point do they fail? There was a theory earlier that compact runs and hits some bad old files. Verifying backend files after backup seems more likely, because compact likely won't run every time.
EDIT: Viewing the log files of a backup job will show CompactResults from the compact run, if it did run. RetryAttempts being non-zero would be expected in a failure case, because it retries before it fails.

In Advanced options, adding and checking --no-auto-compact can avoid its downloads. Adding --backup-test-samples and setting it to 0 can prevent the downloads after backup. These are only for tests, because they hide the issue; however, they do help confirm or refute that you have some bad files…
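As a diagnostic-only sketch (again: this hides bad files rather than fixing anything), the job could temporarily run with:

 --no-auto-compact=true
 --backup-test-samples=0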

If you have enough bad files, starting over anew is probably the best plan, if that would be an option.

I just wanted to add that I have had similar experiences. A backup started about a year ago worked fine until, three weeks ago, it stopped with the invalid header marker error (v2.0.4.5 under Windows 10, backing up to onedriveV2, usually not throttled, 50M block size).

The error messages were not very helpful, as they did not contain any specific information regarding a file:

System.Security.Cryptography.CryptographicException: Invalid header marker ---> System.IO.InvalidDataException: Invalid header marker
  at SharpAESCrypt.SharpAESCrypt.ReadEncryptionHeader(String password, Boolean skipFileSizeCheck)
  at SharpAESCrypt.SharpAESCrypt..ctor(String password, Stream stream, OperationMode mode, Boolean skipFileSizeCheck)
  at Duplicati.Library.Encryption.AESEncryption.Decrypt(Stream input)
  at Duplicati.Library.Encryption.EncryptionBase.Decrypt(Stream input, Stream output)
  at Duplicati.Library.Main.BackendManager.<>c__DisplayClass36_0.<coreDoGetPiping>b__0()
  at System.Threading.Tasks.Task.Execute()
   --- End of inner exception stack trace ---
  at Duplicati.Library.Main.AsyncDownloader.AsyncDownloaderEnumerator.AsyncDownloadedFile.get_TempFile()
  at Duplicati.Library.Main.Operation.CompactHandler.DoCompact(LocalDeleteDatabase db, Boolean hasVerifiedBackend, IDbTransaction& transaction, BackendManager sharedBackend)
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBacked, Boolean forceCompact, BackendManager sharedManager)
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired(BackendManager backend, Int64 lastVolumeSize)
  at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
   --- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
  at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
  at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
  at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
  at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
  at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

I couldn't determine the cause. I tried different things like rebuilding the database and switching between smart retention and other retention strategies, but nothing helped. I did find that, according to Duplicati, the backup had a size of ~148 GB, while according to OneDrive the folder was ~249 GB, so it must have gone out of sync somehow. As I fiddled around a bit over the year, I have no idea what exactly might have contributed to going out of sync. I started a new backup, as that was the least time-consuming way to continue backups.

But it would be great if Duplicati offered some remedy in the case of invalid header markers, like replacing the corrupt file with a new one if the source files still exist in the same version as in the corrupt file, or leading the user through a step-by-step solution to the problem, and/or producing error messages with better information about what went wrong.

Welcome to the forum @llkbennprz

See Upload throttle corrupts backup, especially OneDrive. Analyzed, with code proposed. #3787

If you happen to still have your old backup (after starting the new one), you can look for its symptoms in multiple ways, as listed earlier. If you can identify the bad file, you can look at it with a hex viewer.

Commonly, it seemed like larger files such as the default 50MB dblock were more likely to get hit. Dblock file problems are worse than dlist and dindex problems, because the latter can be fixed by manually deleting the individual file and then running Repair; assuming the problem hasn't also gotten into the DB (e.g. via a Recreate from the damaged files), the DB will regenerate the file.

Dblock files hold the actual file data, so, for example, if the damage is to a file now deleted from the source, its backup is gone.
The AFFECTED command can show what the loss of a given destination file means to a backup.
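For example (all arguments are placeholders; affected is a standard Duplicati CLI command, and it reports which backup versions and source files depend on the named remote volume):

 Duplicati.CommandLine.exe affected <storage-url> duplicati-b0123456789abcdef0123456789abcdef.dblock.zip.aes --dbpath=<path-to-job-database.sqlite>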

Fix implementation of ThrottledStream.Write #3811 was the fix. Throttling even once could have damaged some files, and not throttling afterwards would mean they might take a while to notice, although verification does try to balance its test sample, so I'd have expected it to notice sooner.

Regardless, that's my guess for a OneDrive throttle issue that comes back in the form of bad files. Additional information might be able to refine or refute it. There's no beta yet with that throttle fix.

This might be what you want (I'm not 100% certain), but in practice it didn't work, and it's now off by default. I'm also noticing that Advanced Options doesn't show the option, so it would now need to be re-tested:

v2.0.3.10-2.0.3.10_canary_2018-08-30

Removed automatic attempts to rebuild dblock files as it is slow and rarely finds all the missing pieces (can be enabled with --rebuild-missing-dblock-files ).
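If anyone wants to re-test it, presumably it is enabled like any other advanced option (untested here, and per the note above it rarely finds all the missing pieces):

 --rebuild-missing-dblock-files=true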

@ts678
Thanks for everything. I decided to try everything you mentioned to figure this out, but because I was losing track of what I did or did not do, I created an article which summarizes my steps:

https://www.feutl.com/blog/duplicati-issue-failed-invalid-header-marker

Anyhow, I started by generating the “extended” logs first, to see what they can reveal. This is going to take some days to figure out, but if you have any advice not mentioned yet, let me know and I'll add it to my list.

I'll keep you updated.

AES File Format is what's expected as the .aes file header; otherwise you get the “Invalid header marker” error.

 3 Octets - 'AES'
 1 Octet  - 0x02 (Version)
 1 Octet  - Reserved (set to 0x00)
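So a quick spot check needs only the first five bytes. With xxd (assuming it is installed), a healthy file should show something like this (the filename is a placeholder):

 $ xxd -l 5 duplicati-example.dblock.zip.aes
 00000000: 4145 5302 00                             AES..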

If your Raspberry Pi has these typical commands, you can cd to your backup and survey for bad files by

find . -type f \( -execdir bash -c '[[ `xxd -plain -l 5 {}` == '4145530200' ]]' \; -or -print \)

EDIT Changed forum source to use triple instead of single backticks to keep command backticks visible.

which is the same plan: the header must be correct, or you'll hear about it. If you get a bad header, what was it?

I am going to test the find statement as one of the next things, but first, this is the output of my last backup. I removed some lines because they were just repeated for other dblocks or dlists.
Both files throwing the error still exist on the system.

2019-11-18 21:00:01 +00 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2019-11-18 22:35:12 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2019-11-18 22:37:16 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (23.50 KB)
2019-11-18 22:39:58 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b8e36b4cc806744528068136de4468f9b.dblock.zip.aes (49.80 MB)
2019-11-18 22:40:54 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b8e36b4cc806744528068136de4468f9b.dblock.zip.aes (49.80 MB)
<...omitted...>
2019-11-19 00:39:14 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20191118T210001Z.dlist.zip.aes (62.31 MB)
2019-11-19 00:40:11 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20191118T210001Z.dlist.zip.aes (62.31 MB)
<...omitted...>
2019-11-19 03:29:24 +00 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting because there are 5 fully deletable volume(s)
2019-11-19 03:29:28 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-b8226c47f8aee4f08ae3b7dd32afd72d3.dblock.zip.aes (249.62 MB)
2019-11-19 03:29:34 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-b8226c47f8aee4f08ae3b7dd32afd72d3.dblock.zip.aes (249.62 MB)
<...omitted...>
2019-11-19 03:31:20 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bd72accde1ec049dc99bc453d7d9e92cf.dblock.zip.aes (249.96 MB)
2019-11-19 03:38:59 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-bd72accde1ec049dc99bc453d7d9e92cf.dblock.zip.aes (249.96 MB)
2019-11-19 03:38:59 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bd742daac0957402f81e85c35a5be6662.dblock.zip.aes (249.81 MB)
2019-11-19 03:46:36 +00 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-bd742daac0957402f81e85c35a5be6662.dblock.zip.aes attempt 1 of 5 failed with message: Invalid header marker
<...omitted...>
2019-11-19 04:10:07 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-bd742daac0957402f81e85c35a5be6662.dblock.zip.aes (249.81 MB)
2019-11-19 04:10:17 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bd742daac0957402f81e85c35a5be6662.dblock.zip.aes (249.81 MB)
2019-11-19 04:18:00 +00 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-bd742daac0957402f81e85c35a5be6662.dblock.zip.aes attempt 5 of 5 failed with message: Invalid header marker
System.Security.Cryptography.CryptographicException: Invalid header marker ---> System.IO.InvalidDataException: Invalid header marker
  at SharpAESCrypt.SharpAESCrypt.ReadEncryptionHeader (System.String password, System.Boolean skipFileSizeCheck) [0x0004b] in <45c43198e80545aeabaa9b98839b4c7d>:0 
  at SharpAESCrypt.SharpAESCrypt..ctor (System.String password, System.IO.Stream stream, SharpAESCrypt.OperationMode mode, System.Boolean skipFileSizeCheck) [0x0018a] in <45c43198e80545aeabaa9b98839b4c7d>:0 
  at (wrapper remoting-invoke-with-check) SharpAESCrypt.SharpAESCrypt..ctor(string,System.IO.Stream,SharpAESCrypt.OperationMode,bool)
  at Duplicati.Library.Encryption.AESEncryption.Decrypt (System.IO.Stream input) [0x00000] in <a3217bd0ca2d4a6bb2457ef1971c41fc>:0 
  at Duplicati.Library.Encryption.EncryptionBase.Decrypt (System.IO.Stream input, System.IO.Stream output) [0x00000] in <a3217bd0ca2d4a6bb2457ef1971c41fc>:0 
  at Duplicati.Library.Main.BackendManager+<>c__DisplayClass36_0.<coreDoGetPiping>b__0 () [0x00029] in <759bd83d98134a149cdf84e129a07d38>:0 
  at System.Threading.Tasks.Task.InnerInvoke () [0x0000f] in <3833a6edf2074b959d3dab898627f0ac>:0 
  at System.Threading.Tasks.Task.Execute () [0x00000] in <3833a6edf2074b959d3dab898627f0ac>:0 
   --- End of inner exception stack trace ---
  at Duplicati.Library.Main.BackendManager.coreDoGetPiping (Duplicati.Library.Main.BackendManager+FileEntryItem item, Duplicati.Library.Interface.IEncryption useDecrypter, System.Int64& retDownloadSize, System.String& retHashcode) [0x002d3] in <759bd83d98134a149cdf84e129a07d38>:0 
  at Duplicati.Library.Main.BackendManager.DoGet (Duplicati.Library.Main.BackendManager+FileEntryItem item) [0x002ff] in <759bd83d98134a149cdf84e129a07d38>:0 
  at Duplicati.Library.Main.BackendManager.ThreadRun () [0x000ff] in <759bd83d98134a149cdf84e129a07d38>:0 
<...omitted...>
2019-11-19 04:20:17 +00 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bd743e3577a764e6eabba37e83ec36cac.dblock.zip.aes (249.59 MB)
2019-11-19 04:20:28 +00 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-bd743e3577a764e6eabba37e83ec36cac.dblock.zip.aes attempt 1 of 5 failed with message: Thread was being aborted.
System.Threading.ThreadAbortException: Thread was being aborted.
  at (wrapper managed-to-native) System.Threading.Monitor.Monitor_wait(object,int)
  at System.Threading.Monitor.ObjWait (System.Boolean exitContext, System.Int32 millisecondsTimeout, System.Object obj) [0x0002f] in <3833a6edf2074b959d3dab898627f0ac>:0 
  at System.Threading.Monitor.Wait (System.Object obj, System.Int32 millisecondsTimeout, System.Boolean exitContext) [0x0000e] in <3833a6edf2074b959d3dab898627f0ac>:0 
  at System.Threading.Monitor.Wait (System.Object obj) [0x00000] in <3833a6edf2074b959d3dab898627f0ac>:0 
  at Renci.SshNet.Sftp.SftpFileReader.Read () [0x0004f] in <c60291af03764e69b153abec058a6ec5>:0 
  at Renci.SshNet.SftpClient.InternalDownloadFile (System.String path, System.IO.Stream output, Renci.SshNet.Sftp.SftpDownloadAsyncResult asyncResult, System.Action`1[T] downloadCallback) [0x00078] in <c60291af03764e69b153abec058a6ec5>:0 
  at Renci.SshNet.SftpClient.DownloadFile (System.String path, System.IO.Stream output, System.Action`1[T] downloadCallback) [0x00006] in <c60291af03764e69b153abec058a6ec5>:0 
  at Duplicati.Library.Backend.SSHv2.Get (System.String remotename, System.IO.Stream stream) [0x00012] in <40ac48623b834b889568d306e7f7925b>:0 
  at Duplicati.Library.Main.BackendManager.coreDoGetPiping (Duplicati.Library.Main.BackendManager+FileEntryItem item, Duplicati.Library.Interface.IEncryption useDecrypter, System.Int64& retDownloadSize, System.String& retHashcode) [0x00262] in <759bd83d98134a149cdf84e129a07d38>:0 
  at Duplicati.Library.Main.BackendManager.DoGet (Duplicati.Library.Main.BackendManager+FileEntryItem item) [0x002ff] in <759bd83d98134a149cdf84e129a07d38>:0 
  at Duplicati.Library.Main.BackendManager.ThreadRun () [0x000ff] in <759bd83d98134a149cdf84e129a07d38>:0 
2019-11-19 04:20:28 +00 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Security.Cryptography.CryptographicException: Invalid header marker ---> System.IO.InvalidDataException: Invalid header marker
  at SharpAESCrypt.SharpAESCrypt.ReadEncryptionHeader (System.String password, System.Boolean skipFileSizeCheck) [0x0004b] in <45c43198e80545aeabaa9b98839b4c7d>:0 
  at SharpAESCrypt.SharpAESCrypt..ctor (System.String password, System.IO.Stream stream, SharpAESCrypt.OperationMode mode, System.Boolean skipFileSizeCheck) [0x0018a] in <45c43198e80545aeabaa9b98839b4c7d>:0 
  at (wrapper remoting-invoke-with-check) SharpAESCrypt.SharpAESCrypt..ctor(string,System.IO.Stream,SharpAESCrypt.OperationMode,bool)
  at Duplicati.Library.Encryption.AESEncryption.Decrypt (System.IO.Stream input) [0x00000] in <a3217bd0ca2d4a6bb2457ef1971c41fc>:0 
  at Duplicati.Library.Encryption.EncryptionBase.Decrypt (System.IO.Stream input, System.IO.Stream output) [0x00000] in <a3217bd0ca2d4a6bb2457ef1971c41fc>:0 
  at Duplicati.Library.Main.BackendManager+<>c__DisplayClass36_0.<coreDoGetPiping>b__0 () [0x00029] in <759bd83d98134a149cdf84e129a07d38>:0 
  at System.Threading.Tasks.Task.InnerInvoke () [0x0000f] in <3833a6edf2074b959d3dab898627f0ac>:0 
  at System.Threading.Tasks.Task.Execute () [0x00000] in <3833a6edf2074b959d3dab898627f0ac>:0 
   --- End of inner exception stack trace ---
  at Duplicati.Library.Main.AsyncDownloader+AsyncDownloaderEnumerator+AsyncDownloadedFile.get_TempFile () [0x00008] in <759bd83d98134a149cdf84e129a07d38>:0 
  at Duplicati.Library.Main.Operation.CompactHandler.DoCompact (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Boolean hasVerifiedBackend, System.Data.IDbTransaction& transaction, Duplicati.Library.Main.BackendManager sharedBackend) [0x00264] in <759bd83d98134a149cdf84e129a07d38>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00397] in <759bd83d98134a149cdf84e129a07d38>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <759bd83d98134a149cdf84e129a07d38>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00d9f] in <759bd83d98134a149cdf84e129a07d38>:0 

This throws an error, and because I do not fully understand what this script does, I cannot fix it on my own:

bash: -c: line 0: conditional binary operator expected
bash: -c: line 0: syntax error near unexpected token `-plain'
bash: -c: line 0: `[[ xxd -plain -l 5 ./duplicati-b02ac2b3df2ab426ea03948e12f2ccb15.dblock.zip.aes == 4145530200 ]]'
./duplicati-b02ac2b3df2ab426ea03948e12f2ccb15.dblock.zip.aes

Sorry. The forum software ate some backticks that were supposed to be around the xxd portion of the line.
You can cut and paste the fixed version, and see if it does better. Here's the general plan of the script:

Backticks cause the string between them to be run, and their output to be returned. In this case, it uses xxd to dump the first five bytes as hex digits, which should be 4145530200; this is compared (==) against that string, simply hardcoded into the script. This is done inside a bash expression ([[ and ]]) and the result is returned. If true (i.e. the header is what was expected), the find expression has a known result without evaluating the -or, so the -print is not done. If something other than the expected header was found, the -or is evaluated to see if the expression might actually be true, and evaluating the -print prints the path of the file.
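If the one-liner's syntax is hard to read, here is an equivalent explicit loop doing the same check (a sketch assuming bash and xxd are available; narrowed to *.aes names for clarity):

 find . -type f -name '*.aes' -print0 | while IFS= read -r -d '' f; do
   header=$(xxd -plain -l 5 "$f")        # first 5 bytes as hex; expect 4145530200
   if [[ $header != 4145530200 ]]; then  # 41 45 53 = "AES", 02 = version, 00 = reserved
     echo "BAD HEADER ($header): $f"
   fi
 done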

That’s much like Duplicati’s check, and I had described it as “the header must be correct or you’ll hear”.

Even though we didn't get a full file audit, we now have one bad file actually named, in the post just above.

So what does the file look like, in terms of things like length and header? Run xxd as above if you like; however, hexdump -C on the file will make a more human-friendly display. A longer header will look like:

00000000  41 45 53 02 00 00 21 43  52 45 41 54 45 44 5f 42  |AES...!CREATED_B|
00000010  59 00 53 68 61 72 70 41  45 53 43 72 79 70 74 20  |Y.SharpAESCrypt |
00000020  76 31 2e 33 2e 31 2e 30  00 80 00 00 00 00 00 00  |v1.3.1.0........|
00000030  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|

For anything but a very short file, you probably want to pipe (|, which is a vertical bar) the output into head:

hexdump -C duplicati-bd742daac0957402f81e85c35a5be6662.dblock.zip.aes | head

For more serious review (if it comes to that), you can pipe into less, or redirect into a file for an editor.

If the issue turns out to be an empty file, you can quickly audit using something like ls -lSr | head
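For example, truly empty files can be listed directly (GNU find assumed):

 find . -type f -size 0 -print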

Seems there are only 3 files with a broken header, which is good, but still no clue why.
Also, those files are quite old; they most probably hold information which got de-duplicated and therefore did not get deleted.
Anyhow, is there a way to resolve the issue somehow, now that we have found the bad ones? :wink:

These are the files, based on the above command:

  • ./nextcloud-data/duplicati-bd742daac0957402f81e85c35a5be6662.dblock.zip.aes
    • Date: 07.11.2018
    • Size: 250MB
  • ./nextcloud-data/duplicati-bfcc04033b8f54b61a1c91d24fa04fb17.dblock.zip.aes
    • Date: 13.09.2018
    • Size: 250MB
  • ./nextcloud-data/duplicati-b3ab98cd26a2e469fa0c8262c7b96ff87.dblock.zip.aes
    • Date: 03.10.2018
    • Size: 250MB

At least one file matches the information in the log file I got with the extended logging turned on. Below is the head of the hexdump output for the broken files and another random good file.

Broken ones

# hexdump -C duplicati-bd742daac0957402f81e85c35a5be6662.dblock.zip.aes | head
00000000  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
*
02cfb6e0  01 0d c6 32 79 2f 36 61  ef 11 79 5c 6a 72 76 b6  |...2y/6a..y\jrv.|
02cfb6f0  e3 9d a8 b2 b3 da a9 78  c3 8b 66 76 05 1c 0b fc  |.......x..fv....|
02cfb700  92 84 82 d9 ab 99 18 82  92 24 bc 3e f6 ce 19 df  |.........$.>....|
02cfb710  64 56 c7 5b 4d 52 1f c4  c5 87 dc 85 29 66 85 25  |dV.[MR......)f.%|
02cfb720  7d e8 13 c7 e3 a9 9b dc  2f ee 8e 23 34 32 c7 6f  |}......./..#42.o|
02cfb730  e2 15 7f 45 63 07 39 17  5e 28 2f 7a 7e 15 b6 b3  |...Ec.9.^(/z~...|
02cfb740  24 83 86 0e 9d b4 8d 66  88 df 9a b5 6f 2e 0c fb  |$......f....o...|
02cfb750  70 c6 d9 f6 8a 89 77 ac  bd ab 22 d5 29 75 9f b2  |p.....w...".)u..|

# hexdump -C duplicati-bfcc04033b8f54b61a1c91d24fa04fb17.dblock.zip.aes | head
00000000  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
*
00037fa0  00 00 00 00 00 a6 e6 14  97 72 db 12 92 3c ad ea  |.........r...<..|
00037fb0  2c 6e 84 0d 47 4f ac 1e  4c 8d 00 38 41 83 e7 35  |,n..GO..L..8A..5|
00037fc0  45 3e b0 88 2c 0d d6 b7  3f 01 ee f4 2a 5e c6 09  |E>..,...?...*^..|
00037fd0  99 6f 85 f5 0f ba 3f 13  b3 87 c5 54 0d f8 92 84  |.o....?....T....|
00037fe0  be ba de ea 27 f5 05 0b  66 f7 83 9b 2b 09 08 eb  |....'...f...+...|
00037ff0  a7 ce 5c 50 76 5d 2d 13  68 d8 fe 53 09 c9 01 61  |..\Pv]-.h..S...a|
00038000  58 0f 2a 12 80 40 70 3d  15 5d 85 fe bf 87 d2 59  |X.*..@p=.].....Y|
00038010  f5 ec 0c 1c 78 a2 84 27  4e fe 77 46 4a 26 c1 bd  |....x..'N.wFJ&..|

# hexdump -C duplicati-b3ab98cd26a2e469fa0c8262c7b96ff87.dblock.zip.aes | head
00000000  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
*
0503fd80  00 00 00 00 00 00 00 00  00 00 00 8b 7e 30 3b f0  |............~0;.|
0503fd90  97 1b dd 32 76 59 66 f1  a7 bc 76 f6 7a 78 65 41  |...2vYf...v.zxeA|
0503fda0  26 8c c5 fe 58 e0 7c 6e  9d 5b e4 d1 2c 98 31 22  |&...X.|n.[..,.1"|
0503fdb0  f3 0a 62 9a 70 ad 18 4d  c7 09 f5 52 f2 ff b0 9c  |..b.p..M...R....|
0503fdc0  d7 37 bf b7 ae 3f b7 eb  7d e9 fa 67 b3 7b e7 aa  |.7...?..}..g.{..|
0503fdd0  c3 a5 62 de f1 93 60 d9  51 b9 cb 90 33 4e 09 50  |..b...`.Q...3N.P|
0503fde0  e2 40 12 ae 9a c6 f0 27  89 18 5f a9 bb cc 4b 25  |.@.....'.._...K%|
0503fdf0  d5 3c c5 7b 1c cd 10 d0  5c 64 ec a7 69 32 93 20  |.<.{....\d..i2. |

Good one

# hexdump -C duplicati-bf6a3048f8e9a44aca242f02ff5c887a4.dblock.zip.aes | head
00000000  41 45 53 02 00 00 21 43  52 45 41 54 45 44 5f 42  |AES...!CREATED_B|
00000010  59 00 53 68 61 72 70 41  45 53 43 72 79 70 74 20  |Y.SharpAESCrypt |
00000020  76 31 2e 33 2e 33 2e 30  00 80 00 00 00 00 00 00  |v1.3.3.0........|
00000030  00 00 00 00 00 00 00 00  00 00 00 00 00 00 00 00  |................|
*
000000a0  00 00 00 00 00 00 00 00  00 00 00 00 13 05 52 17  |..............R.|
000000b0  ce fc 96 11 b4 1b b7 97  03 ce f9 76 bc ec 8e 81  |...........v....|
000000c0  67 33 4f da b7 67 8f 0f  c4 4f 35 27 4e ed 21 bc  |g3O..g...O5'N.!.|
000000d0  c4 a1 49 28 f0 a9 86 c3  bb e0 26 e6 37 f1 46 41  |..I(......&.7.FA|
000000e0  43 8c e4 1b 27 28 e7 b4  cd 15 15 ba 76 6d 70 c4  |C...'(......vmp.|