File length is invalid?


#1

I tried to find anything in the error message that would explain what went wrong, but this one is hard to understand. It's from a backup of Hyper-V guests:

Failed: File length is invalid
Details: System.Security.Cryptography.CryptographicException: File length is invalid ---> System.IO.InvalidDataException: File length is invalid
at SharpAESCrypt.SharpAESCrypt.Read(Byte[] buffer, Int32 offset, Int32 count)
at Duplicati.Library.Utility.Utility.CopyStream(Stream source, Stream target, Boolean tryRewindSource, Byte[] buf)
at Duplicati.Library.Encryption.EncryptionBase.Decrypt(Stream input, Stream output)
at Duplicati.Library.Main.BackendManager.<>c__DisplayClass36_0.<coreDoGetPiping>b__0()
at System.Threading.Tasks.Task.Execute()
--- End of inner exception stack trace ---
at Duplicati.Library.Main.AsyncDownloader.AsyncDownloaderEnumerator.AsyncDownloadedFile.get_TempFile()
at Duplicati.Library.Main.Operation.CompactHandler.DoCompact(LocalDeleteDatabase db, Boolean hasVerifiedBackend, IDbTransaction& transaction, BackendManager sharedBackend)
at Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBacked, Boolean forceCompact, BackendManager sharedManager)
at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired(BackendManager backend, Int64 lastVolumeSize)
at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass13_0.<Backup>b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

Log data:
2018-12-22 04:04:03 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Security.Cryptography.CryptographicException: File length is invalid ---> System.IO.InvalidDataException: File length is invalid
at SharpAESCrypt.SharpAESCrypt.Read(Byte[] buffer, Int32 offset, Int32 count)
at Duplicati.Library.Utility.Utility.CopyStream(Stream source, Stream target, Boolean tryRewindSource, Byte[] buf)
at Duplicati.Library.Encryption.EncryptionBase.Decrypt(Stream input, Stream output)
at Duplicati.Library.Main.BackendManager.<>c__DisplayClass36_0.<coreDoGetPiping>b__0()
at System.Threading.Tasks.Task.Execute()
--- End of inner exception stack trace ---
at Duplicati.Library.Main.AsyncDownloader.AsyncDownloaderEnumerator.AsyncDownloadedFile.get_TempFile()
at Duplicati.Library.Main.Operation.CompactHandler.DoCompact(LocalDeleteDatabase db, Boolean hasVerifiedBackend, IDbTransaction& transaction, BackendManager sharedBackend)
at Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBacked, Boolean forceCompact, BackendManager sharedManager)
at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired(BackendManager backend, Int64 lastVolumeSize)
at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()

#2

Oh, I just noticed: Duplicati also crashed and wasn't running anymore.


#3

The start and the end are a mystery, but the logged problem probably began in the delete that runs after backup, per your retention settings. If deleting could free up enough potential space, the default automatic compact runs to actually reclaim it by repackaging the scattered old blocks that are still useful. That involves a decrypt, repack, and re-encrypt. Encryption itself has no expectations about file size, but decryption does, so somehow a file's length got changed on its way out during an old backup, while sitting at the storage destination, or on retrieval for decryption…
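The length expectation comes from the AES Crypt container format used for the .aes files. As a rough sketch of the idea (not SharpAESCrypt's actual code; `header_len` is a hypothetical stand-in for the variable-length header covering the magic bytes, extensions, IVs, encrypted session key, and its HMAC), the ciphertext after the header must be a whole number of 16-byte AES blocks, followed by a one-byte plaintext-length modulo and a 32-byte HMAC:

```python
def aescrypt_payload_ok(total_len, header_len):
    # AES Crypt v2 trailer: 1 byte (plaintext length mod 16) + 32-byte HMAC.
    TRAILER = 1 + 32
    body = total_len - header_len - TRAILER
    # Ciphertext must be a non-negative multiple of the 16-byte AES block size.
    return body >= 0 and body % 16 == 0
```

A truncated or length-altered file almost always fails a check like this, which is roughly what "File length is invalid" signals.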

AES File Format

SharpAESCrypt

That's the code that probably produced your message. Usually file-size issues are truncations (often to powers of 2), but that's not guaranteed. The error messages aren't saying much that I can see about specific file names, lengths, etc. Repeatable problems can be run under enhanced logging (e.g. --log-level=Information) to see file downloads. Non-repeatable ones can sometimes be tracked down by a thorough search, but it can take a lot of searching. Integrity checks can be spread out with a raised --backup-test-samples if that's a more comfortable approach.
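Since truncations often land on powers of two, a quick sanity check on a suspect file's byte count can hint at that failure mode. A heuristic sketch (not anything Duplicati itself does):

```python
def looks_like_power_of_two_truncation(size_bytes):
    # A power of two has exactly one bit set, so size & (size - 1) == 0.
    return size_bytes > 0 and size_bytes & (size_bytes - 1) == 0
```

If a damaged dblock comes out at exactly 32 MiB or similar, that is a strong clue the transfer or storage layer cut it off.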


#4

So far it’s only occurred this one time.

So not much to worry about for now :slight_smile:


#5

It happened again. The Duplicati server crashed and disappeared this time too. I’m adding --log-level=Information and we’ll see if that gives any clues next time it happens.

Failed: File length is invalid
Details: System.Security.Cryptography.CryptographicException: File length is invalid ---> System.IO.InvalidDataException: File length is invalid
   at SharpAESCrypt.SharpAESCrypt.Read(Byte[] buffer, Int32 offset, Int32 count)
   at Duplicati.Library.Utility.Utility.CopyStream(Stream source, Stream target, Boolean tryRewindSource, Byte[] buf)
   at Duplicati.Library.Encryption.EncryptionBase.Decrypt(Stream input, Stream output)
   at Duplicati.Library.Main.BackendManager.<>c__DisplayClass36_0.<coreDoGetPiping>b__0()
   at System.Threading.Tasks.Task.Execute()
   --- End of inner exception stack trace ---
   at Duplicati.Library.Main.AsyncDownloader.AsyncDownloaderEnumerator.AsyncDownloadedFile.get_TempFile()
   at Duplicati.Library.Main.Operation.CompactHandler.DoCompact(LocalDeleteDatabase db, Boolean hasVerifiedBackend, IDbTransaction& transaction, BackendManager sharedBackend)
   at Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBacked, Boolean forceCompact, BackendManager sharedManager)
   at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired(BackendManager backend, Int64 lastVolumeSize)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass13_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

Log data:
2018-12-29 03:25:17 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Security.Cryptography.CryptographicException: File length is invalid ---> System.IO.InvalidDataException: File length is invalid
   at SharpAESCrypt.SharpAESCrypt.Read(Byte[] buffer, Int32 offset, Int32 count)
   at Duplicati.Library.Utility.Utility.CopyStream(Stream source, Stream target, Boolean tryRewindSource, Byte[] buf)
   at Duplicati.Library.Encryption.EncryptionBase.Decrypt(Stream input, Stream output)
   at Duplicati.Library.Main.BackendManager.<>c__DisplayClass36_0.<coreDoGetPiping>b__0()
   at System.Threading.Tasks.Task.Execute()
   --- End of inner exception stack trace ---
   at Duplicati.Library.Main.AsyncDownloader.AsyncDownloaderEnumerator.AsyncDownloadedFile.get_TempFile()
   at Duplicati.Library.Main.Operation.CompactHandler.DoCompact(LocalDeleteDatabase db, Boolean hasVerifiedBackend, IDbTransaction& transaction, BackendManager sharedBackend)
   at Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBacked, Boolean forceCompact, BackendManager sharedManager)
   at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired(BackendManager backend, Int64 lastVolumeSize)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()

#6

That’s one of my favorites for not only getting additional context, but also getting info on destination uploads and downloads. Even without advance setup, you can try looking in Job --> Show log --> Remote to see backend operations right around the time the troublesome file was seemingly downloaded. The compact will probably get (meaning download) several dblock files so it can repackage their contents more tightly into new dblock (and associated dindex) files to be uploaded. You might be able to find a filename that way to study further. A very-long-shot outcome would be that it’s the same filename as before, but I would have expected such a situation to show up on the next compact attempt from the next backup, and that wasn’t what actually happened.
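If you copy the Remote log text out, the operation and filename can be pulled from each line for further study. A small sketch, with the line format assumed from the listing quoted later in this thread:

```python
import re

# Matches lines like: "* 31 dec 2018 03:20: delete duplicati-...dlist.zip.aes"
LINE = re.compile(
    r"\*\s+(?P<when>.+?):\s+(?P<op>get|put|delete|list)\s+(?P<name>\S+)",
    re.IGNORECASE,
)

def parse_remote_line(line):
    """Return (operation, filename) for a Remote log line, or None."""
    m = LINE.search(line)
    return (m.group("op").lower(), m.group("name")) if m else None
```

Grouping the results by filename would quickly show whether the same dblock keeps coming up.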

From a code point of view, I think this is hitting one of the InvalidFileLength errors (which one is unclear) in https://github.com/kenkendk/sharpaescrypt/blob/master/Source/SharpAESCrypt.cs

Possibly looking in the Windows temporary file area for right-timed files with dup* names would be worthwhile.

Obviously Duplicati.Server.exe is not supposed to vanish, and the Information log might help understand that.

Another monitoring tool, which I hesitate to jump right into (due to the amount of data it might log), is Sysinternals Process Monitor. If the issue is ever narrowed down (or even better, reproduced on demand), it can watch things closely.


#7

Now it crashes each day, which might at least make the problem easier to hunt down. The very last items in Job --> Show log --> Remote, right up to the crash and Duplicati disappearing, are these lines:

* 31 dec 2018 03:20: delete duplicati-20181230T115255Z.dlist.zip.aes
* 31 dec 2018 03:20: delete duplicati-20181224T020000Z.dlist.zip.aes
* 31 dec 2018 03:19: put duplicati-20181231T020000Z.dlist.zip.aes
{"Size":2605,"Hash":"Xdis0mD5aPQdEesFAMPVBNXxgYxtuHEvOUxCDQ1Pr6Q="}
* 31 dec 2018 03:19: put duplicati-iaccfbc1703a54ae7a46392b1f0a7acb7.dindex.zip.aes
{"Size":607453,"Hash":"7b+szq9MNR3/d0kaqPczJeg8Ze6ekksXfqZCwhvtkDo="}
* 31 dec 2018 03:19: put duplicati-baafc60511d614d7ba6379de788f334d5.dblock.zip.aes
{"Size":51525741,"Hash":"RMxf5T/BZu4VORn5EBdfRzqJkI84yKEDE0xcNFGNGfw="}
* 31 dec 2018 03:19: put duplicati-id8a9501afd4044a9a8c12f5cc554f8ec.dindex.zip.aes

Sorry if this is a stupid question, but where is the extended Information-level log found?


#8

--log-file lets you pick the spot, and --log-level picks the level.
EDIT: For a quick peek, use About --> Show log --> Live and pick a level, but for this a file is probably better.
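For a GUI job, both can be added under the job's advanced options; as a command-line invocation it would look roughly like this (paths, target URL, and source folder here are placeholders, not taken from this setup):

```shell
REM Hypothetical example; substitute your own destination URL and source.
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup ^
  "<target-url>" "C:\source" ^
  --log-file="C:\ProgramData\Duplicati\duplicati.log" ^
  --log-level=Information
```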


#9

Thanks!

Here’s the log from this morning’s crash:

2019-01-01 03:30:00 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2019-01-01 03:30:18 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2019-01-01 03:30:25 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (15,86 KB)
2019-01-01 03:34:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bd513ed1373b24cc2a7b1589f8a6ee253.dblock.zip.aes (49,96 MB)
2019-01-01 03:34:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bd513ed1373b24cc2a7b1589f8a6ee253.dblock.zip.aes (49,96 MB)
2019-01-01 03:34:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i9e38538b6a0c401ba0064d2ae3d0b983.dindex.zip.aes (6,98 KB)
2019-01-01 03:34:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i9e38538b6a0c401ba0064d2ae3d0b983.dindex.zip.aes (6,98 KB)
2019-01-01 03:34:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b8d4434c53a6941d39f03130f802641ed.dblock.zip.aes (49,87 MB)
2019-01-01 03:34:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b8d4434c53a6941d39f03130f802641ed.dblock.zip.aes (49,87 MB)
2019-01-01 03:34:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i247d6dc977524ec7b892bed3c954d5b7.dindex.zip.aes (1,37 MB)
2019-01-01 03:34:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i247d6dc977524ec7b892bed3c954d5b7.dindex.zip.aes (1,37 MB)
2019-01-01 03:34:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b615133f96faf48169cebf4d499b5af6b.dblock.zip.aes (49,33 MB)
2019-01-01 03:34:52 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b615133f96faf48169cebf4d499b5af6b.dblock.zip.aes (49,33 MB)
2019-01-01 03:34:52 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i4ecb8aa478ec49e3a3bfddae8779f3bd.dindex.zip.aes (706,86 KB)
2019-01-01 03:34:52 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i4ecb8aa478ec49e3a3bfddae8779f3bd.dindex.zip.aes (706,86 KB)
2019-01-01 03:34:52 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b1e6ad4ce51014252a854add86a29804f.dblock.zip.aes (49,98 MB)
2019-01-01 03:34:53 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b1e6ad4ce51014252a854add86a29804f.dblock.zip.aes (49,98 MB)
2019-01-01 03:34:53 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ica4a3d93564648ac96007a83d468cc02.dindex.zip.aes (6,45 KB)
2019-01-01 03:34:53 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b0eb422703b9443f0a5c3ddceb7530a98.dblock.zip.aes (49,91 MB)
2019-01-01 03:47:36 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i9574ef2a14694f16872c47993b68858e.dindex.zip.aes (4,23 KB)
2019-01-01 03:47:37 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i9574ef2a14694f16872c47993b68858e.dindex.zip.aes (4,23 KB)
2019-01-01 03:47:37 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bc4147b46388041cabf78819d54285b27.dblock.zip.aes (15,77 MB)
2019-01-01 03:47:37 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bc4147b46388041cabf78819d54285b27.dblock.zip.aes (15,77 MB)
2019-01-01 03:47:37 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ia51ac57779ae4a678b1b17129b6c35a3.dindex.zip.aes (591,20 KB)
2019-01-01 03:47:37 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ia51ac57779ae4a678b1b17129b6c35a3.dindex.zip.aes (591,20 KB)
2019-01-01 03:47:41 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20190101T023000Z.dlist.zip.aes (2,54 KB)
2019-01-01 03:47:41 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20190101T023000Z.dlist.zip.aes (2,54 KB)
2019-01-01 03:47:41 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2019-01-01 03:47:41 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2019-01-01 03:47:41 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 2018-12-31 21:57:58, 2018-12-31 03:00:00, 2018-12-30 03:00:00, 2018-12-29 03:00:00, 2018-12-28 03:00:00, 2018-12-27 03:00:00, 2018-12-26 03:00:00, 2018-12-25 03:00:00, 2018-12-23 03:00:00, 2018-12-16 03:00:00, 2018-12-09 03:00:00, 2018-12-01 07:27:20
2019-01-01 03:47:41 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2019-01-01 03:47:41 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 2018-12-31 21:57:58, 2018-12-25 03:00:00
2019-01-01 03:47:41 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteRemoteFileset]: Deleting 2 remote fileset(s) ...
2019-01-01 03:47:48 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20181225T020000Z.dlist.zip.aes (2,54 KB)
2019-01-01 03:47:48 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20181225T020000Z.dlist.zip.aes (2,54 KB)
2019-01-01 03:47:48 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20181231T205758Z.dlist.zip.aes (2,54 KB)
2019-01-01 03:47:48 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20181231T205758Z.dlist.zip.aes (2,54 KB)
2019-01-01 03:47:48 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: Deleted 2 remote fileset(s)
2019-01-01 03:47:51 +01 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting because there is 28,63% wasted space and the limit is 25%
2019-01-01 03:48:17 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b59e185896bf543f39c53ca424910d61e.dblock.zip.aes (49,91 MB)
2019-01-01 03:48:18 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b59e185896bf543f39c53ca424910d61e.dblock.zip.aes (49,91 MB)
2019-01-01 03:48:18 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b524dd9b641574046ad8a0d877dd5d632.dblock.zip.aes (49,80 MB)
2019-01-01 03:48:18 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b524dd9b641574046ad8a0d877dd5d632.dblock.zip.aes (49,80 MB)
2019-01-01 03:48:18 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:48:19 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:48:29 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:48:34 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:48:44 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:48:44 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:48:54 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:48:54 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:49:04 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:49:04 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
2019-01-01 03:49:04 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b067629cc95ea4ed1a5117368a3b237da.dblock.zip.aes (49,41 MB)
2019-01-01 03:49:25 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Security.Cryptography.CryptographicException: File length is invalid ---> System.IO.InvalidDataException: File length is invalid
   at SharpAESCrypt.SharpAESCrypt.Read(Byte[] buffer, Int32 offset, Int32 count)
   at Duplicati.Library.Utility.Utility.CopyStream(Stream source, Stream target, Boolean tryRewindSource, Byte[] buf)
   at Duplicati.Library.Encryption.EncryptionBase.Decrypt(Stream input, Stream output)
   at Duplicati.Library.Main.BackendManager.<>c__DisplayClass36_0.<coreDoGetPiping>b__0()
   at System.Threading.Tasks.Task.Execute()
   --- End of inner exception stack trace ---
   at Duplicati.Library.Main.AsyncDownloader.AsyncDownloaderEnumerator.AsyncDownloadedFile.get_TempFile()
   at Duplicati.Library.Main.Operation.CompactHandler.DoCompact(LocalDeleteDatabase db, Boolean hasVerifiedBackend, IDbTransaction& transaction, BackendManager sharedBackend)
   at Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBacked, Boolean forceCompact, BackendManager sharedManager)
   at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired(BackendManager backend, Int64 lastVolumeSize)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()
2019-01-01 03:49:27 +01 - [Information-Duplicati.Library.Modules.Builtin.SendMail-SendMailComplete]: Email sent successfully using server: smtp://smtp.gmail.com:587/?starttls=when-available

#10

Here are the files currently in the temp directory:

 Volume in drive C is Boot
 Volume Serial Number is 848E-B472

 Directory of C:\Users\---\AppData\Local\Temp\1

2018-12-27  16:06    <DIR>          HttpServer
2018-12-27  16:06    <DIR>          WPF
2018-12-28  03:22               156 dup-4b7c8051-69b7-4da9-974f-f0e0b9490d13
2018-12-28  03:22               248 dup-f851600b-7e9f-45be-9db0-bd6499f76b17
2018-12-28  04:44               156 dup-0d7862f5-f185-46e8-95a7-a62a72ce47b4
2018-12-28  04:44               248 dup-3c88f2e5-6689-4ca1-9f0c-a80dd6dfad58
2018-12-29  03:21               156 dup-afd3d232-1b70-4151-b549-919ffeff42ba
2018-12-29  03:24        23 651 980 dup-99a1cc83-26b9-4105-8b2e-35f9bab52c76
2018-12-29  09:10               338 dup-da7669e6-c31e-4be2-844a-6b289cde3691
2018-12-29  09:19               232 dup-1c27f1f5-9a69-49f8-ae58-7042fb64b1d9
2018-12-29  09:19               232 dup-d813e6e0-314c-4f87-a755-7941e0c42420
2018-12-29  09:19               232 dup-b78b22b4-47da-4542-9705-53ddc06a9c46
2018-12-29  09:19               232 dup-b44beb89-bdcb-4bf1-8f3e-a72c81ec1e17
2018-12-29  09:19               232 dup-cc16936e-2249-4fa8-b666-f45a7b34f04c
2018-12-29  09:19               232 dup-4df8c2a8-deb9-4f2a-923e-30892ed870cd
2018-12-29  09:19               232 dup-7c26ab6e-22be-4692-a717-98d82ee72725
2018-12-29  09:19               232 dup-a297a1fb-5f40-4858-a869-e60516623a66
2018-12-29  09:19               232 dup-55b612d3-1bd1-4ca6-b700-e6beaa881bde
2018-12-29  09:19               232 dup-13413a5e-44f1-41af-a689-becb648158b7
2018-12-29  09:19               232 dup-193208d5-7824-4d60-b053-4a4b0ef1b576
2018-12-29  09:19               232 dup-fb609e6e-ef50-4179-b678-ab0f4eb38f93
2018-12-29  09:19               232 dup-29bf15f3-ed34-4f10-b98e-3847532ebf35
2018-12-29  09:19               232 dup-d8891785-8b0f-4cbd-b030-101eae0f4f1e
2018-12-29  09:19               232 dup-5a4fae76-927a-474c-b2a1-f9239bf0af40
2018-12-29  09:19               232 dup-32dd5450-2f27-4506-acd7-7301dc461544
2018-12-29  09:19               232 dup-06cc980c-1376-4cc4-9326-cd536d1a6267
2018-12-29  09:19               232 dup-30664a65-d697-4bc9-9318-916809507973
2018-12-29  09:19               232 dup-a591e234-f1ae-413c-a48a-38e4dd99afcf
2018-12-29  09:19               232 dup-218d8d92-2cae-4f2a-93ea-7b7e79f45357
2018-12-29  09:19               232 dup-62ad4074-29ae-417c-81b0-8a8e7f0e16ca
2018-12-29  09:19               232 dup-b6e27700-ef38-4018-b515-c6addb53427b
2018-12-29  09:19               232 dup-48c6dc93-659d-41fd-bb83-3991086907e9
2018-12-29  09:19               232 dup-104bca6c-9547-42e6-b386-72ef398d747f
2018-12-29  09:19               232 dup-99c90ccc-f51d-4068-b9be-47321d5ea0ca
2018-12-29  09:19               232 dup-51f45370-53d1-4c21-ae3b-6ff7c4bc7f2a
2018-12-29  09:19               232 dup-c579f388-e774-4450-9bc7-1f92ed2b3f55
2018-12-30  02:05               156 dup-352beeb1-c05c-4dae-b230-e408f8d3e090
2018-12-30  02:05               248 dup-3b6922eb-08b3-4588-8e60-eeabe10794eb
2018-12-30  03:27        23 651 980 dup-3fb6a5c0-b397-4054-aa1b-1725226b276e
2018-12-30  03:28               248 dup-fac97f5f-7c46-44e5-9740-a580317a0549
2018-12-31  01:31               156 dup-1352c8c5-e934-458b-a356-b2429752ff09
2018-12-31  02:00               248 dup-563c8d96-8b41-4c1f-849a-78d0e37b3934
2018-12-31  02:10               246 dup-ad26fcdf-b98a-46f9-a21f-3a490023395b
2018-12-31  02:10               154 dup-f9f96e74-0ad3-4c65-8b57-3cf91f587157
2018-12-31  03:20               156 dup-17d1fe32-0cc5-40eb-84c1-c7a4b9f6c886
2018-12-31  03:20        23 651 980 dup-6a25ca57-ad61-49c9-a395-b83e8f79ae93
2019-01-01  01:30               248 dup-34d3948a-28b5-4db6-8119-da9bf72960b0
2019-01-01  01:30               156 dup-f6d06a30-c539-40a8-a2fe-03d2d6f4978b
2019-01-01  02:05               154 dup-95b8f476-5e7b-4bb4-b137-7ea3c3d300b3
2019-01-01  02:05               246 dup-d095c4be-7641-45a1-a33e-388c616f8ec1
2019-01-01  03:47               156 dup-7d1d3c38-90e4-4295-8381-c4b4df1cb733
2019-01-01  03:48        23 651 980 dup-148dc46e-750b-4e6e-bcc6-3b89fcd468a2
2019-01-01  10:26               156 dup-1f62c262-102c-44ce-8662-398dec45bc9f
2019-01-01  10:26               248 dup-bcbb3550-be4e-4807-a486-c488f9bd3535
2019-01-01  10:31    <DIR>          .
2019-01-01  10:31    <DIR>          ..
              52 File(s)     94 618 462 bytes
               4 Dir(s)  475 897 933 824 bytes free

#11

Three new things visible here:

Get - Started: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)

was done 5 times (the default set by --number-of-retries), but it never worked. Does your storage fail often? You could crank the setting up to some huge value to try to ride out intermittencies, in case that helps. Checking logs (especially now that you have a nicer one) can also help, e.g. look for other Get - Failed entries. --retry-delay can also be raised as an experiment. Sometimes storage just needs more time for whatever reason…
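The pattern in the log (five attempts roughly 10 seconds apart, then a final Failed) corresponds to those two options. As a sketch of the idea only (not Duplicati's actual code; `get` is a hypothetical download callable):

```python
import time

def get_with_retries(get, name, number_of_retries=5, retry_delay=10.0):
    last_error = None
    for attempt in range(number_of_retries):
        try:
            return get(name)            # "Get - Started"
        except IOError as err:          # failure logged as "Get - Retrying"
            last_error = err
            if attempt < number_of_retries - 1:
                time.sleep(retry_delay) # log shows ~10 s between attempts
    # After the last attempt, logged as "Get - Failed"
    raise RuntimeError(
        f"Get failed after {number_of_retries} attempts: {name}"
    ) from last_error
```

Raising `number_of_retries` widens the window for a flaky destination to recover, at the cost of a slower failure when the file really is bad.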

Duplicati seemingly sent mail before it died. Did the email actually show up? What should have happened next? Though crashes are rare, I’d ordinarily expect them when something is going on, not in between. Was Duplicati running as a Windows service? I think service failures show up in the event log with some information; otherwise there are other options, such as even more logging (Profiling level) or Sysinternals Process Monitor.

Temp files have an interesting repetitiveness (especially the large file), but the random names impede studying.
Could you run PowerShell to try to get the times down to the second? I’m no expert, but I think it would be something like:

Get-ChildItem dup-148dc46e-750b-4e6e-bcc6-3b89fcd468a2 | Select-Object Name, CreationTime, LastWriteTime

I’m looking at the download code, but I’m not sure how far I’ll get. Even if the experiments don’t help me, they may help some developer who knows these areas. I find it interesting that after the first file of the compact failed to download (or were there others earlier that weren’t shown?), a second was tried, and the exception came fast…


#12
  • Does your storage fail often?

Nope, no problems there whatsoever. I also monitor disk response times and I’ve never seen any issues there either.

  • Did email actually show up? What should have happened next?

Yes, the email showed up, so it was sent out before vanishing. Nothing else should run after this job for a bunch of hours. The job has no run-at-end-of-job tasks added; it’s just a bog-standard Hyper-V backup job.

  • Was Duplicati a Windows service?

No, it runs from the startup folder.

Here’s what I found in Windows event logs (four items, two plus two):

2019-01-01 03:49:49

Application: Duplicati.GUI.TrayIcon.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.Exception
at Duplicati.Library.Logging.Log.get_CurrentScope()
at Duplicati.Library.Logging.Log.WriteMessage(Duplicati.Library.Logging.LogMessageType, System.String, System.String, System.Exception, System.String, System.Object)
at Duplicati.Library.Logging.Log.WriteRetryMessage(System.String, System.String, System.Exception, System.String, System.Object)
at Duplicati.Library.Main.BackendManager.ThreadRun()
at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
at System.Threading.ThreadHelper.ThreadStart()

2019-01-01 03:49:49

Faulting application name: Duplicati.GUI.TrayIcon.exe, version: 2.0.4.5, time stamp: 0x5bfe8f74
Faulting module name: KERNELBASE.dll, version: 10.0.14393.2636, time stamp: 0x5bda7edc
Exception code: 0xe0434352
Fault offset: 0x0000000000034048
Faulting process id: 0xb0c
Faulting application start time: 0x01d4a1449cd8cc92
Faulting application path: C:\Program Files\Duplicati 2\Duplicati.GUI.TrayIcon.exe
Faulting module path: C:\Windows\System32\KERNELBASE.dll
Report Id: d8a0c2c5-4515-4879-9425-e9fe44039acf
Faulting package full name:
Faulting package-relative application ID:

And… again(!) the next night, just a couple of minutes later in the schedule:

2019-01-02 03:52:08

Application: Duplicati.GUI.TrayIcon.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.Exception
at Duplicati.Library.Logging.Log.get_CurrentScope()
at Duplicati.Library.Logging.Log.WriteMessage(Duplicati.Library.Logging.LogMessageType, System.String, System.String, System.Exception, System.String, System.Object)
at Duplicati.Library.Logging.Log.WriteRetryMessage(System.String, System.String, System.Exception, System.String, System.Object)
at Duplicati.Library.Main.BackendManager.ThreadRun()
at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
at System.Threading.ThreadHelper.ThreadStart()

2019-01-02 03:52:08

Faulting application name: Duplicati.GUI.TrayIcon.exe, version: 2.0.4.5, time stamp: 0x5bfe8f74
Faulting module name: KERNELBASE.dll, version: 10.0.14393.2636, time stamp: 0x5bda7edc
Exception code: 0xe0434352
Fault offset: 0x0000000000034048
Faulting process id: 0x1700
Faulting application start time: 0x01d4a1afa03f9e96
Faulting application path: C:\Program Files\Duplicati 2\Duplicati.GUI.TrayIcon.exe
Faulting module path: C:\Windows\System32\KERNELBASE.dll
Report Id: 9adc5a5b-6d5e-4c5a-bcca-49ba3f8310a7
Faulting package full name:
Faulting package-relative application ID:

File info with the better timestamps

> Name                                     CreationTime        LastWriteTime       Length
> dup-06cc980c-1376-4cc4-9326-cd536d1a6267 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-0d7862f5-f185-46e8-95a7-a62a72ce47b4 2018-12-28 04:44:06 2018-12-28 04:44:07 156
> dup-0e076106-bebf-4ae8-b4d8-de2ffab45ece 2019-01-02 08:57:27 2019-01-02 08:57:27 3829800
> dup-104bca6c-9547-42e6-b386-72ef398d747f 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-108be101-666e-4cc7-aa64-52e53ce9b072 2019-01-02 08:29:17 2019-01-02 08:29:40 156
> dup-13413a5e-44f1-41af-a689-becb648158b7 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-1352c8c5-e934-458b-a356-b2429752ff09 2018-12-31 01:31:34 2018-12-31 01:31:41 156
> dup-148dc46e-750b-4e6e-bcc6-3b89fcd468a2 2019-01-01 03:47:51 2019-01-01 03:48:20 23651980
> dup-17d1fe32-0cc5-40eb-84c1-c7a4b9f6c886 2018-12-31 03:20:08 2018-12-31 03:20:10 156
> dup-193208d5-7824-4d60-b053-4a4b0ef1b576 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-1c27f1f5-9a69-49f8-ae58-7042fb64b1d9 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-1f62c262-102c-44ce-8662-398dec45bc9f 2019-01-01 10:26:34 2019-01-01 10:26:34 156
> dup-218d8d92-2cae-4f2a-93ea-7b7e79f45357 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-269d40b0-c920-45b2-804f-63b5e45406b9 2019-01-02 09:05:50 2019-01-02 09:05:55 52060758
> dup-29bf15f3-ed34-4f10-b98e-3847532ebf35 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-30664a65-d697-4bc9-9318-916809507973 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-32dd5450-2f27-4506-acd7-7301dc461544 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-34d3948a-28b5-4db6-8119-da9bf72960b0 2019-01-01 01:30:58 2019-01-01 01:30:59 248
> dup-352beeb1-c05c-4dae-b230-e408f8d3e090 2018-12-30 02:05:41 2018-12-30 02:05:42 156
> dup-3b6922eb-08b3-4588-8e60-eeabe10794eb 2018-12-30 02:05:41 2018-12-30 02:05:43 248
> dup-3c88f2e5-6689-4ca1-9f0c-a80dd6dfad58 2018-12-28 04:44:06 2018-12-28 04:44:09 248
> dup-3fb6a5c0-b397-4054-aa1b-1725226b276e 2018-12-30 03:27:15 2018-12-30 03:27:37 23651980
> dup-48c6dc93-659d-41fd-bb83-3991086907e9 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-4b7c8051-69b7-4da9-974f-f0e0b9490d13 2018-12-28 03:22:51 2018-12-28 03:22:51 156
> dup-4df8c2a8-deb9-4f2a-923e-30892ed870cd 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-51f45370-53d1-4c21-ae3b-6ff7c4bc7f2a 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-55b612d3-1bd1-4ca6-b700-e6beaa881bde 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-563c8d96-8b41-4c1f-849a-78d0e37b3934 2018-12-31 01:31:34 2018-12-31 02:00:13 248
> dup-5a4fae76-927a-474c-b2a1-f9239bf0af40 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-5d3ad2e5-b829-4b3b-9c9e-a4dec11d9b37 2019-01-02 09:06:00 2019-01-02 09:06:03 51872712
> dup-62ad4074-29ae-417c-81b0-8a8e7f0e16ca 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-690d5609-67d3-4630-b95c-3e611ddf99af 2019-01-02 09:06:26 2019-01-02 09:06:27 52303325
> dup-6a25ca57-ad61-49c9-a395-b83e8f79ae93 2018-12-31 03:20:08 2018-12-31 03:20:33 23651980
> dup-70759db4-51d7-4515-9418-621e9b6a5fc2 2019-01-02 09:06:35 2019-01-02 09:06:37 22626829
> dup-71c8cc5c-9131-4962-bfed-ee98196d3988 2019-01-02 02:05:58 2019-01-02 02:06:00 153
> dup-7520719b-e83c-4866-8c52-d28e02d27e04 2019-01-02 03:49:57 2019-01-02 03:50:42 23651980
> dup-7541e76b-2281-4f7f-8426-5eef53df3eaf 2019-01-02 01:31:01 2019-01-02 01:31:02 247
> dup-7aef2e5b-9b43-4ad8-8de2-194befef695e 2019-01-02 09:06:08 2019-01-02 09:06:12 51956189
> dup-7c26ab6e-22be-4692-a717-98d82ee72725 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-7d1d3c38-90e4-4295-8381-c4b4df1cb733 2019-01-01 03:47:51 2019-01-01 03:47:51 156
> dup-90f46cd5-921d-4e77-8398-7e4c55d38e94 2019-01-02 02:05:58 2019-01-02 02:06:13 245
> dup-95b8f476-5e7b-4bb4-b137-7ea3c3d300b3 2019-01-01 02:05:52 2019-01-01 02:05:53 154
> dup-99a1cc83-26b9-4105-8b2e-35f9bab52c76 2018-12-29 03:21:01 2018-12-29 03:24:20 23651980
> dup-99c90ccc-f51d-4068-b9be-47321d5ea0ca 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-a297a1fb-5f40-4858-a869-e60516623a66 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-a591e234-f1ae-413c-a48a-38e4dd99afcf 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-a8f52776-d8b3-4cd5-9db0-973c4532bfe1 2019-01-02 09:06:26 2019-01-02 09:06:31 51886999
> dup-ad26fcdf-b98a-46f9-a21f-3a490023395b 2018-12-31 02:10:11 2018-12-31 02:10:12 246
> dup-afd3d232-1b70-4151-b549-919ffeff42ba 2018-12-29 03:21:01 2018-12-29 03:21:01 156
> dup-b44beb89-bdcb-4bf1-8f3e-a72c81ec1e17 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-b6e27700-ef38-4018-b515-c6addb53427b 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-b78b22b4-47da-4542-9705-53ddc06a9c46 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-bcbb3550-be4e-4807-a486-c488f9bd3535 2019-01-01 10:26:34 2019-01-01 10:26:35 248
> dup-c579f388-e774-4450-9bc7-1f92ed2b3f55 2018-12-29 09:19:52 2018-12-29 09:19:52 232
> dup-cc16936e-2249-4fa8-b666-f45a7b34f04c 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-cc556129-1821-4019-9361-56d12fcfe206 2019-01-02 01:31:01 2019-01-02 01:31:02 155
> dup-d095c4be-7641-45a1-a33e-388c616f8ec1 2019-01-01 02:05:52 2019-01-01 02:05:53 246
> dup-d813e6e0-314c-4f87-a755-7941e0c42420 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-d8891785-8b0f-4cbd-b030-101eae0f4f1e 2018-12-29 09:19:51 2018-12-29 09:19:51 232
> dup-da7669e6-c31e-4be2-844a-6b289cde3691 2018-12-29 09:09:30 2018-12-29 09:10:04 338
> dup-db5f4eeb-e0ab-4201-a085-401200cfee72 2019-01-02 09:06:35 2019-01-02 09:06:35 51938573
> dup-dfaf1ba9-8153-40c0-968a-d1774c8fb814 2019-01-02 03:49:57 2019-01-02 03:49:57 156
> dup-e60977c6-e4a9-40bd-88ad-2b77beceb30b 2019-01-02 09:06:17 2019-01-02 09:06:22 51978652
> dup-f6d06a30-c539-40a8-a2fe-03d2d6f4978b 2019-01-01 01:30:58 2019-01-01 01:30:59 156
> dup-f851600b-7e9f-45be-9db0-bd6499f76b17 2018-12-28 03:22:51 2018-12-28 03:22:54 248
> dup-f9f96e74-0ad3-4c65-8b57-3cf91f587157 2018-12-31 02:10:11 2018-12-31 02:10:12 154
> dup-fac97f5f-7c46-44e5-9740-a580317a0549 2018-12-30 03:27:15 2018-12-30 03:28:45 248
> dup-fb609e6e-ef50-4179-b678-ab0f4eb38f93 2018-12-29 09:19:51 2018-12-29 09:19:51 232

#13

Should I just try deleting the whole backup storage and start from scratch?

Or is there interest in digging a bit more trying to understand what’s going wrong?


#14

I still have some interest in trying to understand (partly because you made an excellent find in the event log); however, any chase will probably not end in a direct code fix, but it might be convertible into a GitHub issue.

Unfortunately, the final System.Exception with get_CurrentScope at the top of the stack is probably beyond my knowledge of the code, but I can do a bit to identify the path leading up to it (which might help).

03:49:04 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
03:49:04 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b067629cc95ea4ed1a5117368a3b237da.dblock.zip.aes (49,41 MB)
03:49:25 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
03:49:27 +01 - [Information-Duplicati.Library.Modules.Builtin.SendMail-SendMailComplete]: Email sent successfully using server: smtp://smtp.gmail.com:587/?starttls=when-available
(end of Information level log -- the rest is in event log)
03:49:49 WriteRetryMessage led to The process was terminated due to an unhandled exception.
03:52:08 WriteRetryMessage led to The process was terminated due to an unhandled exception.

This ended almost (except for timing) like a continuation of the earlier retry pattern (five tries) leading to “Failed”.
The WriteRetryMessage format is like “Operation {0} with file {1} attempt {2} of {3} failed with message: {4}”
Below is an example from my Profiling level log. Raising your log level to Profiling (or at least to Retry) would log more.

2018-12-05 11:51:45 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:09 -05 - [Profiling-Duplicati.Library.Main.BackendManager-DownloadSpeed]: Downloaded 273.68 KB in 00:00:24.0206847, 11.39 KB/s
2018-12-05 11:52:09 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:09 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: RemoteOperationGet took 0:00:00:24.022
2018-12-05 11:52:09 -05 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip attempt 4 of 5 failed with message: Hash mismatch on file "C:\WINDOWS\TEMP\dup-fa172149-0e78-4536-8fe8-184416f441cc", recorded hash: kX+m7GbjoomBrjyx9S83PDRtqKEsXiFB9UjNItn+tn4=, actual hash ZMu04I7abDfkQpbaEmtMfSazb1otEhgMrQan+skANHE=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\WINDOWS\TEMP\dup-fa172149-0e78-4536-8fe8-184416f441cc", recorded hash: kX+m7GbjoomBrjyx9S83PDRtqKEsXiFB9UjNItn+tn4=, actual hash ZMu04I7abDfkQpbaEmtMfSazb1otEhgMrQan+skANHE=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
2018-12-05 11:52:09 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:19 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: Starting - RemoteOperationGet
2018-12-05 11:52:19 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:43 -05 - [Profiling-Duplicati.Library.Main.BackendManager-DownloadSpeed]: Downloaded 273.68 KB in 00:00:24.0197099, 11.39 KB/s
2018-12-05 11:52:43 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:43 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: RemoteOperationGet took 0:00:00:24.022
2018-12-05 11:52:43 -05 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip attempt 5 of 5 failed with message: Hash mismatch on file "C:\WINDOWS\TEMP\dup-fe2d0ecc-9f5d-4584-8e67-50c723c1a598", recorded hash: kX+m7GbjoomBrjyx9S83PDRtqKEsXiFB9UjNItn+tn4=, actual hash ZMu04I7abDfkQpbaEmtMfSazb1otEhgMrQan+skANHE=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\WINDOWS\TEMP\dup-fe2d0ecc-9f5d-4584-8e67-50c723c1a598", recorded hash: kX+m7GbjoomBrjyx9S83PDRtqKEsXiFB9UjNItn+tn4=, actual hash ZMu04I7abDfkQpbaEmtMfSazb1otEhgMrQan+skANHE=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
2018-12-05 11:52:43 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)

What this demonstrates is that “Get - Failed” can also be declared when the downloaded temporary file doesn’t have the expected hash. It’s again not immediately clear whether the records are wrong, the destination corrupted the file, something went wrong during upload or download, or something else entirely caused it.

The other good question (which I can’t answer) is why a download is still retrying after the job should be done. Before that, the question would be why the failure of a dblock download simply moved on to another dblock. There is a similar bug where a failed dblock upload does not end the backup. See the link under DuplicatiVerify below.

Although we won’t know for sure whether your case would show similar details (if you turned logging higher), the method for a comprehensive search for destination file problems is the test command, with all as the sample count. This is the same test that the verification step after a backup runs, just turned up and run manually. The easiest way to run it is probably the job’s GUI Commandline interface, converting the “backup” command format to “test”. Raising --console-log-level to at least Retry seems wise. I’m not sure exactly what happens when an error is found.

Another comprehensive audit is more feasible if the destination can be read directly as files. Running a backup with --upload-verification-file will upload a duplicati-verification.json file after the backup, and the DuplicatiVerify.ps1 (PowerShell) or DuplicatiVerify.py (Python) script will compare it against the actual files. Sample output of a run is here. The source suggests hash failures are reported as *** Hash check failed for file: followed by the full path.

A non-comprehensive test would look only at the specific file that failed after five tries. I don’t know how quickly your destination can serve files, but for slow destinations a comprehensive test of a big backup takes time, and service providers may charge download fees. A targeted test is easier on the system, but harder on you. :wink:

To get a file, it’s best if your destination just lets you fetch it. If not, Duplicati.CommandLine.BackendTool.exe supports the same storage types Duplicati does, although it may be harder to set up. You can find your storage URL in the job’s Commandline “Target URL” box. After obtaining the file, you can indirectly test its health by decrypting it with SharpAESCrypt.exe from your Duplicati folder, or with AES Crypt. The resulting file (which might not appear if decryption fails) should be a zip file, which you can open or test. The file names inside will look meaningless, but they’re actually based on the hash of the block that the backup put into the dblock file. For this path I suggest checking duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes

If the above was way too much, you can certainly see if starting from fresh works. A compromise would be a start-from-fresh on a copy of this job (export, import, change the destination to another location, run), which might get your backup healthy faster (assuming you have the space) while we poke at the troubled one. Backing up one source to two destinations is not a problem, but don’t point multiple backups at one destination.

TL;DR
Which way you go is sort of up to you; you could also try pieces from above in some different arrangement. Helping to understand this would be a service to others who might see it, and might lead to eventual fixes, but people sometimes just want to get back to normal ASAP, which is completely understandable.


#15

I have two destinations for this job. Well, actually I have two identical jobs with the ONLY difference being the destinations. One off-site computer running Minio (S3 “clone”) and a local computer set up exactly the same.

The local destination is the one crashing. The backup data is no longer identical between the two destinations since they’ve run a different number of times. The crashing job to the local computer hasn’t been able to run through since this started.

Anyway, I have easy and full access to all files, they are on a local virtual linux box. And since backup is running ok to the off-site backup server I can hang on to this problematic job a few days more, no problem.



I’ve upped retries to 15 and log-file-log-level to Verbose, added --upload-verification-file, and am running the job now. I’ve only used the web GUI, so I’d prefer to make changes in the job settings there, but I have no problem running from the command line if I get reasonably complete info on the arguments.

This is the job (prior to adding some of the stuff mentioned above) exported to command line (I just removed the email settings, accounts and passwords):

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup

"s3://backup-hyper-v-live/?s3-server-name=backup%3A1111&s3-location-constraint=&s3-storage-class=&auth-username=xxxxxxx&auth-password=xxxxxx"

"%HYPERV%\221df34e-ea30-4efc-ad90-637816cb44f5"
"%HYPERV%\dea1a02d-10d6-4567-a6af-db8f5f81d5ab"
"%HYPERV%\34e77ea4-2dcf-474a-ab0b-d020717fc032"
"%HYPERV%\4973e322-c35d-4b40-b5fc-7fc9f3b7f2f9"

--backup-name=loc-backup-hyper-v-live
--dbpath="C:\Users\Administrator\AppData\Local\Duplicati\YFFWBECKLP.sqlite"
--encryption-module=aes
--compression-module=zip
--dblock-size=50mb
--passphrase=xxxxxx
--retention-policy="1W:1D,4W:1W,12M:1M"
--blocksize=700KB
--snapshot-policy=Required
--log-file-log-level=Information
--log-file=duplicati.log
--number-of-retries=15
--disable-module=console-password-input

So, exactly what can I add or change in the job settings and then what can I upload here to help?

An ls -la of the backup files on the Minio server, for starters?

I’ve uploaded the DuplicatiVerify.py so I’ll run that when this job… has crashed, hehe.


#16

So… it was “Deleting unwanted files” and then… crash.

Ran DuplicatiVerify.py but got “No verification files in folder”

Enclosed:

  • textfile with all timestamp and sizes on the minio server files.zip (523.5 KB)

  • the duplicati logfile (verbose) duplicati.zip (7.1 KB)

The event log only had one of each error this time, not two as before?

2019-01-03 20:36:10

Application: Duplicati.GUI.TrayIcon.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.Exception
   at Duplicati.Library.Logging.Log.get_CurrentScope()
   at Duplicati.Library.Logging.Log.WriteMessage(Duplicati.Library.Logging.LogMessageType, System.String, System.String, System.Exception, System.String, System.Object[])
   at Duplicati.Library.Logging.Log.WriteRetryMessage(System.String, System.String, System.Exception, System.String, System.Object[])
   at Duplicati.Library.Main.BackendManager.ThreadRun()
   at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
   at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
   at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
   at System.Threading.ThreadHelper.ThreadStart()

2019-01-03 20:36:10

Faulting application name: Duplicati.GUI.TrayIcon.exe, version: 2.0.4.5, time stamp: 0x5bfe8f74
Faulting module name: KERNELBASE.dll, version: 10.0.14393.2636, time stamp: 0x5bda7edc
Exception code: 0xe0434352
Fault offset: 0x0000000000034048
Faulting process id: 0x1454
Faulting application start time: 0x01d4a39770c599c0
Faulting application path: C:\Program Files\Duplicati 2\Duplicati.GUI.TrayIcon.exe
Faulting module path: C:\Windows\System32\KERNELBASE.dll
Report Id: 61f19f63-ca57-4f9b-afed-e3e0c1379159
Faulting package full name: 
Faulting package-relative application ID: 

and this?
2019-01-03 20:36:14

Fault bucket 2149804879127711759, type 5
Event Name: CLR20r3
Response: Not available
Cab Id: 0

Problem signature:
P1: Duplicati.GUI.TrayIcon.exe
P2: 2.0.4.5
P3: 5bfe8f74
P4: Duplicati.Library.Logging
P5: 2.0.4.5
P6: 5bfe8f4f
P7: 27
P8: 3e
P9: System.Exception
P10: 

Attached files:
\\?\C:\ProgramData\Microsoft\Windows\WER\Temp\WER34AA.tmp.WERInternalMetadata.xml

These files may be available here:
C:\ProgramData\Microsoft\Windows\WER\ReportArchive\AppCrash_Duplicati.GUI.Tr_f44afef9c277fab7fac4f3b1995660cd9748_cba2fa62_19ca3a38

Analysis symbol: 
Rechecking for solution: 0
Report Id: 61f19f63-ca57-4f9b-afed-e3e0c1379159
Report Status: 0
Hashed bucket: 055744f4486a3e17cdd5a425bccf100f

Log data “Remote” looks like before:

* Jan 3, 2019 8:32 PM: delete duplicati-20190103T184313Z.dlist.zip.aes
* Jan 3, 2019 8:32 PM: put duplicati-20190103T191540Z.dlist.zip.aes
{"Size":2589,"Hash":"5+Z02uQwWEdA8QmyXjqC2lXIQN89bk4uVZuQHmtZhvE="}
* Jan 3, 2019 8:32 PM: put duplicati-ie0f514c3760248038f41b86701044b48.dindex.zip.aes
{"Size":3069,"Hash":"9kNdlEOcy4GQRG05ywDdD8CXPVFb1+gzG2tl8uX932o="}
* Jan 3, 2019 8:32 PM: put duplicati-bf08df326fe37449eb52d315612710ea4.dblock.zip.aes
{"Size":34840541,"Hash":"VXlrXm205ZwKLuMPbmnQJXNKUVBcNKCW5bNsp2S8WoI="}
* Jan 3, 2019 8:32 PM: put duplicati-i558b57f6ebd042879d8dd72a981278ec.dindex.zip.aes
{"Size":4573,"Hash":"tDwlf8wWc+yarWlzSeTo69OALmsLaD6AaTMhHjjGXhc="}
* Jan 3, 2019 8:32 PM: put duplicati-b4bb6eb0dfd7447559d3a2c41907dcb17.dblock.zip.aes
{"Size":52300957,"Hash":"jF07CaS1oto7ROTtHmD4Hs7Sd4EF0j27WFF0WCuE/mo="}
* Jan 3, 2019 8:32 PM: put duplicati-i8e5734248f164caf8bcded53b5e1aac1.dindex.zip.aes
{"Size":720957,"Hash":"O+TTFhBKNtVIx2/Pz7OtLcMAP5oKCranFn3PY8O4d+w="} and so on.....

And finally the job exported to command line (email stuff removed)

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup

"s3://backup-hyper-v-live/?s3-server-name=backup%3A1111&s3-location-constraint=&s3-storage-class=&auth-username=xxxxx&auth-password=xxxxxx"

"%HYPERV%\221df34e-ea30-4efc-ad90-637816cb44f5"
"%HYPERV%\dea1a02d-10d6-4567-a6af-db8f5f81d5ab"
"%HYPERV%\34e77ea4-2dcf-474a-ab0b-d020717fc032"
"%HYPERV%\4973e322-c35d-4b40-b5fc-7fc9f3b7f2f9"

--backup-name=loc-backup-hyper-v-live
--dbpath="C:\Users\Administrator\AppData\Local\Duplicati\YFFWBECKLP.sqlite"
--encryption-module=aes
--compression-module=zip
--dblock-size=50mb
--passphrase=xxxxxxx
--retention-policy="1W:1D,4W:1W,12M:1M"
--blocksize=700KB
--snapshot-policy=Required
--log-file-log-level=Verbose
--log-file=duplicati.log
--number-of-retries=15
--upload-verification-file=true
--disable-module=console-password-input

#17

Yes, it keeps trying the same compact, and it has the same download problem, but now the view is better.

The two main problems are:

  1. Because duplicati-verification.json is uploaded after a backup, and yours didn’t finish, it’s not there.

  2. It’s hard to get DuplicatiVerify.py to show its help, though giving it too many arguments will do it. Its usage is:

DuplicatiVerify.py <path to verification file>
DuplicatiVerify.py <folder with verification files>
DuplicatiVerify.py (no arguments, uses current dir)

To fix the first problem, you could try adding --no-auto-compact long enough to finish, then remove it again.
To fix the second, you probably must either point the script at the backup folder, or cd to the backup folder before running the script.

I’m not an expert in Windows internals, but some searching makes me think Buckets speaks to that.

The main advantage of logging at Verbose was that it’s higher than Retry, which added info not seen before.

2019-01-03 20:32:53 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49.48 MB)
2019-01-03 20:32:53 +01 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes attempt 1 of 15 failed with message: File length is invalid
2019-01-03 20:32:53 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49.48 MB)
...
2019-01-03 20:35:21 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49.48 MB)
2019-01-03 20:35:26 +01 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes attempt 15 of 15 failed with message: File length is invalid
2019-01-03 20:35:26 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49.48 MB)
2019-01-03 20:35:26 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b067629cc95ea4ed1a5117368a3b237da.dblock.zip.aes (49.41 MB)
2019-01-03 20:35:46 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Security.Cryptography.CryptographicException: File length is invalid ---> System.IO.InvalidDataException: File length is invalid
(end of Verbose level log -- the rest is in event log)
2019-01-03 20:36:10 System.Exception during WriteRetryMessage ending at at Duplicati.Library.Logging.Log.get_CurrentScope()

Seeing duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes error 15 times makes it suspect. Sometimes damaged files have sizes rounded to powers of 2, but this one did not (nor did the other suspects).
Minio seems to support only S3-type access, otherwise I’d suggest trying another access method to see if that helps. Possibly you could download the file some other way for decryption, or install AES Crypt there.

It looks like you might have run “Create bug report”. If so, that’s an alternative way to check the file record: look it up by Name in the RemoteVolume table and take its Size and Hash. The Hash is Base64 encoded, but Cryptii can convert it (Enter Text, then Decode to Bytes, Hexadecimal, Group by None, or whatever matches the SHA-256 tool you choose). On Linux, sha256sum looks suitable; on Windows, perhaps Hash Tool.
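If Cryptii isn’t handy, the same Base64-to-hex conversion is a one-liner in stdlib Python, yielding the lowercase hex that sha256sum prints:

```python
import base64

def b64_hash_to_hex(b64_hash: str) -> str:
    """Convert a Base64-encoded hash (as stored in the RemoteVolume
    table or shown in the remote log) to lowercase hex, the format
    printed by sha256sum and most Windows hash tools."""
    return base64.b64decode(b64_hash).hex()
```

Compare the result against sha256sum run on the downloaded file; any difference means the stored record and the actual file disagree.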

A third way of finding the size and hash (the first was looking in duplicati-verification.json, the second was above) is to look for duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes in the GUI under the job’s Remote logs, then click on the file name to get its additional information. The earliest date should be the upload, making

-rw-r--r-- 1 root root 51882765 2018-12-11 03:17:30.792292547 +0100 duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes

and the others would probably be download tries. I’m not sure if you’ll see its many retries, but uploads do show them. Getting very lucky would show not only the failing retry, but also a size and hash different from the upload. Here, “lucky” means it’s easy and explains the reported error, though it still leaves us wondering how it got there.
Single bad dblock files can be dealt with, though, and the affected command can even show the impact of their loss. Before getting too optimistic about that, note that no DuplicatiVerify or big test command has been run yet.


#18

I sort of skimmed over your discussion with @magnust and it sounds like you’re doing all the right things.

I apologize if this was already covered, but if you (temporarily) change the retention policy to “keep all” does the error go away?

I’m also wondering if the “email sent even though Duplicati crashed” scenario might be a bug in the multi-threading, where one thread thinks the process is done (so it sends an email) while another thread is still going (let’s do retention clean-up).

Do you know if --no-auto-compact is honored during retention cleanup?


#19

Ordinarily, avoiding deletes would avoid compact, because no wasted space is otherwise created except by unusual manual work such as purge. In the current situation the compact trigger has already been hit (but the underlying problem never fixed). Compact probably runs quite often after retention deletes, because that’s a very common way wasted space appears:

2019-01-03 20:32:10 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2019-01-03 20:32:10 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2019-01-03 20:32:10 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 2019-01-03 19:43:13, 2019-01-03 07:39:34, 2019-01-02 03:30:00, 2019-01-01 03:30:00, 2018-12-31 03:00:00, 2018-12-30 03:00:00, 2018-12-29 03:00:00, 2018-12-28 03:00:00, 2018-12-23 03:00:00, 2018-12-16 03:00:00, 2018-12-09 03:00:00, 2018-12-01 07:27:20
2019-01-03 20:32:10 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2019-01-03 20:32:10 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 2019-01-03 19:43:13
2019-01-03 20:32:10 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteRemoteFileset]: Deleting 1 remote fileset(s) ...
2019-01-03 20:32:15 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20190103T184313Z.dlist.zip.aes (2.53 KB)
2019-01-03 20:32:15 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20190103T184313Z.dlist.zip.aes (2.53 KB)
2019-01-03 20:32:15 +01 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: Deleted 1 remote fileset(s)
2019-01-03 20:32:19 +01 - [Verbose-Duplicati.Library.Main.Database.LocalDeleteDatabase-FullyDeletableCount]: Found 0 fully deletable volume(s)
2019-01-03 20:32:19 +01 - [Verbose-Duplicati.Library.Main.Database.LocalDeleteDatabase-SmallVolumeCount]: Found 2 small volumes(s) with a total size of 6.56 MB
2019-01-03 20:32:19 +01 - [Verbose-Duplicati.Library.Main.Database.LocalDeleteDatabase-WastedSpaceVolumes]: Found 3800 volume(s) with a total of 31,56% wasted space (211.68 GB of 670.70 GB)
2019-01-03 20:32:19 +01 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting because there is 31,56% wasted space and the limit is 25%

Possibly preventing the delete would prevent the compact, but it was easier to just directly tell it not to compact.

The code that immediately follows delete selection and deletion suggests that a lack of deletes also suppresses compacting.
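For reference, the decision the log shows at 20:32:19 reduces to a simple percentage test. A sketch of that logic (the 25% default corresponds, as far as I can tell, to the --threshold option; the real code also considers fully deletable and small volumes):

```python
def should_compact(wasted_bytes: float, total_bytes: float,
                   threshold_percent: float = 25.0) -> bool:
    """Compact when wasted space exceeds the threshold, as in
    'Compacting because there is 31,56% wasted space and the
    limit is 25%' in the log above."""
    if total_bytes <= 0:
        return False
    return 100.0 * wasted_bytes / total_bytes > threshold_percent
```

With the log’s numbers (211.68 GB wasted of 670.70 GB total, about 31.6%), the trigger fires every run until a compact actually succeeds, which is why the job keeps walking into the same bad download.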

EDIT:

Yes, I think there’s a lack of coordination on when it’s time to call things done. I’m not sure if this would be in the 2.0.3.6 increase in concurrency, or if possibly the problem existed before, but somehow wasn’t detected.


#20

Running a test with “keep all” right now. Will get back with results.

DuplicatiVerify.py fails because there is no verification file in the directory. So the next thing I’ll try tonight is --no-auto-compact, to see if I get the verification file uploaded, and then I’ll run the verify Python tool.