Help: Hash mismatch error

I have a small backup (just some files) running from UNC network shares to Google Drive. The backup runs every 93 minutes. Sometimes it is fine, but sometimes errors are reported.

The problem: tonight I had some successful runs and some with errors, although the data hasn’t changed because I was asleep in bed:

2018-01-08 19:58:40 No error
2018-01-08 21:12:18 No error
2018-01-08 23:40:05 No error => from here on I was in bed, so no locked files or anything like that
2018-01-09 00:17:00 error

Failed to process file duplicati-20180101T194456Z.dlist.zip.aes => Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-78116ddc-ee20-4ba8-8bd9-62c9f5de9f94",
recorded hash: ujWYpxRbVGCXD0lRhBrgONWOpWwYN/yZ9Td147iiNyc=, actual hash 6V+/lNSIeuMw7TWFHcSzYsK+iWDVNU7ETblu6NL12Ig=

2018-01-09 01:50:00 the same error with a different Temp\dup-… file
2018-01-09 03:23:44 no error!

2018-01-09 04:56:00 error

Failed to process file duplicati-20180101T194456Z.dlist.zip.aes => Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-8585f2ae-c280-483e-904b-e41facda6a4e",
recorded hash: ujWYpxRbVGCXD0lRhBrgONWOpWwYN/yZ9Td147iiNyc=, actual hash 6V+/lNSIeuMw7TWFHcSzYsK+iWDVNU7ETblu6NL12Ig=

and so it continues.

Why do I get these hash mismatch errors?

Hey @thommyX. I would suggest you exclude the %TEMP% folder from your backup altogether. That should get rid of your error.

Hello samw,

this temp folder is not part of the backup! My backup contains just a KeepassPortable folder and some batch files, but not this temp folder.

The file in the error message is located in “C:\Users\server\AppData\Local\Temp” and is created by Duplicati itself.

Verifications: [
    Key: duplicati-20180101T194456Z.dlist.zip.aes
    Value: [
        Key: Error
        Value: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-bc73effb-494f-4128-8794-7a4bfe8f5057", recorded hash: ujWYpxRbVGCXD0lRhBrgONWOpWwYN/yZ9Td147iiNyc=, actual hash 6V+/lNSIeuMw7TWFHcSzYsK+iWDVNU7ETblu6NL12Ig=
    ],
    Key: duplicati-i51c100550d7446d1855754bcd8d0df15.dindex.zip.aes
    Value: [],
    Key: duplicati-b5bff8332cba44685ae2d25530624815c.dblock.zip.aes
    Value: []
]

My source folders are UNC paths, not C:\, and none of them contains any file from the user folder (server is the username of the account that runs Duplicati), nor from the same disk, nor even from the same host (both are different hosts from the one Duplicati runs on):

\\golem\sandisk64_r\myportables\KeePass2Portable\
\\thorn2\C_R\data\serverbat\

Btw: The backup was started again and finished successfully without any change.
Then I started it manually and got this:

[screenshot]

So again I had a success and then a failure (reported as a warning, but the results show an error?!):

DeletedFiles: 0
DeletedFolders: 0
ModifiedFiles: 0
ExaminedFiles: 643
OpenedFiles: 0
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 173905407
SizeOfOpenedFiles: 0
NotProcessedFiles: 0
AddedFolders: 0
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults: null
DeleteResults: null
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20180101T194456Z.dlist.zip.aes
        Value: [
            Key: Error
            Value: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-4a40ea46-fd0f-48db-b9df-af833743ae12", recorded hash: ujWYpxRbVGCXD0lRhBrgONWOpWwYN/yZ9Td147iiNyc=, actual hash 6V+/lNSIeuMw7TWFHcSzYsK+iWDVNU7ETblu6NL12Ig=
        ],
        Key: duplicati-if536338395004abaad835877ef6a941f.dindex.zip.aes
        Value: [],
        Key: duplicati-b9b54cd9f00de4f4285b5d4b36c1b3427.dblock.zip.aes
        Value: []
    ]
    ParsedResult: Success
    EndTime: 09.01.2018 12:07:31 (1515496051)
    BeginTime: 09.01.2018 12:06:34 (1515495994)
    Duration: 00:00:56.5283660
    BackendStatistics:
        RemoteCalls: 9
        BytesUploaded: 0
        BytesDownloaded: 3182859
        FilesUploaded: 0
        FilesDownloaded: 7
        FilesDeleted: 0
        FoldersCreated: 0
        RetryAttempts: 4
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 119
        KnownFileSize: 97503323
        LastBackupDate: 09.01.2018 11:08:00 (1515492480)
        BackupListCount: 81
        TotalQuotaSpace: 16106127360
        FreeQuotaSpace: 104901199
        AssignedQuotaSpace: -1
        ParsedResult: Success
ParsedResult: Error
EndTime: 09.01.2018 12:07:31 (1515496051)
BeginTime: 09.01.2018 12:06:17 (1515495977)
Duration: 00:01:14.6986444
Messages: [
    removing file listed as Temporary: duplicati-b2249e73b81ed4829bfe36ae2425db900.dblock.zip.aes,
    removing file listed as Temporary: duplicati-iffd0e06011ea4483885edb46157711ec.dindex.zip.aes
]
Warnings: []
Errors: [
    Failed to process file duplicati-20180101T194456Z.dlist.zip.aes => Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-4a40ea46-fd0f-48db-b9df-af833743ae12", recorded hash: ujWYpxRbVGCXD0lRhBrgONWOpWwYN/yZ9Td147iiNyc=, actual hash 6V+/lNSIeuMw7TWFHcSzYsK+iWDVNU7ETblu6NL12Ig=
]

After the job I have many dup… files in this temp folder, but none of them has this dup-4a name:

So another try:

  • Deleted all dup* files in this temp folder
  • Started the job
  • Job finished with error
Errors: [
    Failed to process file duplicati-20180101T194456Z.dlist.zip.aes => Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-fc751b19-1d4c-4ada-bbb2-5baf573e3a2d", recorded hash: ujWYpxRbVGCXD0lRhBrgONWOpWwYN/yZ9Td147iiNyc=, actual hash 6V+/lNSIeuMw7TWFHcSzYsK+iWDVNU7ETblu6NL12Ig=
]
  • After the job, a file dupl-usagereport-3596-20180109111630 exists with this content:
{"uid":"573d1aa3da274da4836df350b442fdcc","ostype":"Windows","osversion":"Microsoft Windows NT 6.1.7601 Service Pack 1","clrversion":".Net 4.0.30319.42000","appname":"Duplicati","appversion":"2.0.2.13","setid":"ada5730e4a7b431db5d7c8d8e8b76e81","assembly":"Duplicati.GUI.TrayIcon","items":[{"timestamp":1515496590,"type":"Information","count":77,"name":"BACKUP_DURATION","data":"77"}]}

As you can see, it still sometimes runs without error and sometimes with an error:

Now I have identified a second job with such a problem.

The job reported before uses Google Drive as the target; this one uses hubiC. So, different cloud providers.

As you can see, nothing changed (no new data, no modified data), but twice there was a hash mismatch error! Why did only these 2 executions fail while all the others were fine?

The second one had an error, then I started it manually and it was fine?!

What can I do to get rid of these errors? If such an error occurs, then doesn’t occur, and then occurs again, I’m not sure whether I can trust the software.

Here are 2 outputs, one with error and one without. The reported hashes are the same in each error run:

Error

DeletedFiles: 0
DeletedFolders: 0
ModifiedFiles: 0
ExaminedFiles: 2882
OpenedFiles: 66
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 15627526590
SizeOfOpenedFiles: 219909208
NotProcessedFiles: 0
AddedFolders: 0
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults: null
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    CompactResults: null
    ParsedResult: Success
    EndTime: 10.01.2018 09:12:53
    BeginTime: 10.01.2018 09:12:52
    Duration: 00:00:01.1775180
    BackendStatistics:
        RemoteCalls: 9
        BytesUploaded: 0
        BytesDownloaded: 78291451
        FilesUploaded: 0
        FilesDownloaded: 7
        FilesDeleted: 0
        FoldersCreated: 0
        RetryAttempts: 4
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 2089
        KnownFileSize: 16099928997
        LastBackupDate: 08.01.2018 10:37:14
        BackupListCount: 21
        TotalQuotaSpace: 0
        FreeQuotaSpace: 0
        AssignedQuotaSpace: -1
        ParsedResult: Success
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20180105T174530Z.dlist.zip.aes
        Value: [],
        Key: duplicati-i7655205a00814fa7b5d2cd16152cb930.dindex.zip.aes
        Value: [],
        Key: duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes
        Value: [
            Key: Error
            Value: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-9377faa0-3df1-4056-b264-1b2de2076662", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
        ]
    ]
    ParsedResult: Success
    EndTime: 10.01.2018 09:14:57
    BeginTime: 10.01.2018 09:12:58
    Duration: 00:01:59.1543254
ParsedResult: Error
EndTime: 10.01.2018 09:14:57
BeginTime: 10.01.2018 09:12:00
Duration: 00:02:57.8052239
Messages: [
    No remote filesets were deleted,
    removing file listed as Temporary: duplicati-b1594069909b448188a6c9a2a8bd263f9.dblock.zip.aes,
    removing file listed as Temporary: duplicati-ia7c74726eca84e9eb9339c25bc7266ff.dindex.zip.aes
]
Warnings: []
Errors: [
    Failed to process file duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes => Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-9377faa0-3df1-4056-b264-1b2de2076662", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
]

Success

DeletedFiles: 0
DeletedFolders: 0
ModifiedFiles: 0
ExaminedFiles: 2882
OpenedFiles: 66
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 15627526590
SizeOfOpenedFiles: 219909208
NotProcessedFiles: 0
AddedFolders: 0
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults: null
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    CompactResults: null
    ParsedResult: Success
    EndTime: 10.01.2018 09:26:56
    BeginTime: 10.01.2018 09:26:54
    Duration: 00:00:01.7525269
    BackendStatistics:
        RemoteCalls: 5
        BytesUploaded: 0
        BytesDownloaded: 13568791
        FilesUploaded: 0
        FilesDownloaded: 3
        FilesDeleted: 0
        FoldersCreated: 0
        RetryAttempts: 0
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 2089
        KnownFileSize: 16099928997
        LastBackupDate: 08.01.2018 10:37:14
        BackupListCount: 21
        TotalQuotaSpace: 0
        FreeQuotaSpace: 0
        AssignedQuotaSpace: -1
        ParsedResult: Success
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20180106T083234Z.dlist.zip.aes
        Value: [],
        Key: duplicati-ib4dd4773fa1148bab499bc87faeffe22.dindex.zip.aes
        Value: [],
        Key: duplicati-bf826a7757d124a529cecc16fc7530cdd.dblock.zip.aes
        Value: []
    ]
    ParsedResult: Success
    EndTime: 10.01.2018 09:27:09
    BeginTime: 10.01.2018 09:26:58
    Duration: 00:00:10.8326659
ParsedResult: Success
EndTime: 10.01.2018 09:27:09
BeginTime: 10.01.2018 09:26:15
Duration: 00:00:54.1533297
Messages: [
    No remote filesets were deleted,
    removing file listed as Temporary: duplicati-beacbe09d995e470490350d0ad56aca04.dblock.zip.aes,
    removing file listed as Temporary: duplicati-i4ccdaca07d7a4163ad0a5bbe1e48be8d.dindex.zip.aes
]
Warnings: []
Errors: []

I believe the backup success vs. warning/error confusion is because the backup itself works (success), but then during the testing step the hash mismatch is found (error).

All the errors make it seem like somehow the temp folder is being included in the backup. A couple of things to try to help confirm (or rule out) that would be:

  • pretend you’re going to do a restore and see if you can filter (or browse) to C:\Users\server\AppData\Local\Temp\ (if you can, then the files are in the backup)
  • post your backup job via “Export …” then “As Command-line …” (be sure to remove/obfuscate any personally identifiable parameters like user names, email addresses, passwords, etc.)
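Also, if you want to exercise just the verification step on its own, I believe the standalone test command does exactly that download-and-hash-check (a rough sketch only - substitute your own storage URL, database path, and passphrase, and adjust the sample count):

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe"
test
"googledrive://xxxxxxxxx?authid=xxxxxxxx"
5
--dbpath="C:\Users\server\AppData\Local\Duplicati\xxxxxx.sqlite"
--passphrase="xxxxxxxxxxxxxxxxxxxxxxxxxx"

If that also fails intermittently, it would point at the download/verify path rather than at the backup itself.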

I’ve had the same problem, and it was gone after changing the schedule from every 15 minutes to every 30 minutes. It solved other issues too. Maybe try this.

Hello JonMikeIV,

in the restore browser I do not have C:\ at all! The first job has 2 UNC paths (to other machines - maybe you remember my thread about the restore GUI and 2 UNC paths; that is this job) and the second job has one UNC path to a photo folder, also on another machine.

Here is a screenshot:

The other job also has no temp folder in the restore!

Here is the exported command line (I have added line breaks for easier reading):

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe"
backup
"hubic://default/xxxxxxxxx?authid=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
"\\golem\localdata_r\photo\2012\\"
--backup-name="hubiC   <= photo"
--dbpath="C:\Users\server\AppData\Local\Duplicati\hubiC.sqlite"
--encryption-module="aes"
--compression-module="zip"
--dblock-size="15MB"
--keep-time="1Y"
--passphrase="xxxxxxxxxxxxxxxxxxxxxxxxxx"
--send-http-url="https://www.duplicati-monitoring.com/log/xxxxxx/xxxxxxxxxxxxxxx/xxx"
--disable-module="console-password-input"

Hello skoch,

thanks for your suggestion, but I don’t believe the scheduler interval has anything to do with it. One of the jobs with this problem runs every 93 minutes, the other one every 699 minutes.

I have 22 jobs in total, with intervals from 45 minutes to 1200 minutes, and at the moment only 2 jobs have this problem (and both only have it sometimes).

Thanks for posting the job export - I think I may have misunderstood the error. This particular “hash mismatch” error only seems to come from one place in the code - when getting files from the destination and saving them locally as a temp file. (Assuming I’m reading things correctly.)

So I think what’s being reported is that a dlist file (a list of what’s been stored in a specific dblock file in the destination) has been downloaded into the temp folder, but the hash of the downloaded file doesn’t match what the local database was expecting, so Duplicati treats the file contents as suspect.
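For illustration, the check presumably boils down to something like this (a minimal sketch in Python, assuming SHA-256 with Base64 encoding, which matches the look of the hash strings in the error messages; the path and hash below are placeholders):

import base64
import hashlib

def verify_download(temp_path, recorded_hash_b64):
    # Recompute the hash of the downloaded temp file and compare it
    # to the hash the local database recorded when the file was uploaded.
    h = hashlib.sha256()
    with open(temp_path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode("ascii") == recorded_hash_b64

# e.g. verify_download(r"C:\Users\server\AppData\Local\Temp\dup-...",
#                      "ujWYpxRbVGCXD0lRhBrgONWOpWwYN/yZ9Td147iiNyc=")

A mismatch from a check like this only tells you the bytes on disk differ from the bytes that were uploaded, not where they got changed (destination, transfer, decryption, or local disk).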

My guess is that in the logs just before the hash error you’ll see a message similar to “Downloaded xxx MB in yyy seconds, zzz MB/s” (it might also start with “Downloaded and decrypted…”). If so, then it looks like the file being downloaded is the right size (otherwise you’d be seeing a different error) but the resulting hash of the file doesn’t match for some reason.

Assuming all of that is correct, it might be time to pull @kenkendk or somebody more familiar with the hashing checks into the conversation…

Just out of curiosity, do things work as expected if you enable the --no-backend-verification parameter? (Note that this isn’t a fix so much as a check for narrowing down when/where the error is happening.)
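On the exported command line that would just be your backup command with the flag appended (in the GUI it can be added under the job’s advanced options), e.g.:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup ... --no-backend-verification=true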

Version: Duplicati - 2.0.2.1_beta_2017-08-01

Here are the logs (Jobs, Logs => General + Remote). Is there another log anywhere on disk, or can I change the log level somewhere to get more detailed information the next time this happens?

General

Jan 10, 2018 10:14 AM: Result
DeletedFiles: 0
DeletedFolders: 0
ModifiedFiles: 0
ExaminedFiles: 2882
OpenedFiles: 66
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 15627526590
SizeOfOpenedFiles: 219909208
NotProcessedFiles: 0
AddedFolders: 0
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults: null
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    CompactResults: null
    ParsedResult: Success
    EndTime: 10.01.2018 09:12:53
    BeginTime: 10.01.2018 09:12:52
    Duration: 00:00:01.1775180
    BackendStatistics:
        RemoteCalls: 9
        BytesUploaded: 0
        BytesDownloaded: 78291451
        FilesUploaded: 0
        FilesDownloaded: 7
        FilesDeleted: 0
        FoldersCreated: 0
        RetryAttempts: 4
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 2089
        KnownFileSize: 16099928997
        LastBackupDate: 08.01.2018 10:37:14
        BackupListCount: 21
        TotalQuotaSpace: 0
        FreeQuotaSpace: 0
        AssignedQuotaSpace: -1
        ParsedResult: Success
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20180105T174530Z.dlist.zip.aes
        Value: [],
        Key: duplicati-i7655205a00814fa7b5d2cd16152cb930.dindex.zip.aes
        Value: [],
        Key: duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes
        Value: [
            Key: Error
            Value: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-9377faa0-3df1-4056-b264-1b2de2076662", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
        ]
    ]
    ParsedResult: Success
    EndTime: 10.01.2018 09:14:57
    BeginTime: 10.01.2018 09:12:58
    Duration: 00:01:59.1543254
ParsedResult: Error
EndTime: 10.01.2018 09:14:57
BeginTime: 10.01.2018 09:12:00
Duration: 00:02:57.8052239
Messages: [
    No remote filesets were deleted,
    removing file listed as Temporary: duplicati-b1594069909b448188a6c9a2a8bd263f9.dblock.zip.aes,
    removing file listed as Temporary: duplicati-ia7c74726eca84e9eb9339c25bc7266ff.dindex.zip.aes
]
Warnings: []
Errors: [
    Failed to process file duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes => Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-9377faa0-3df1-4056-b264-1b2de2076662", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
]

Jan 10, 2018 10:14 AM: Message
Failed to process file duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-9377faa0-3df1-4056-b264-1b2de2076662", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
   at Duplicati.Library.Main.BackendManager.GetForTesting(String remotename, Int64 size, String hash)
   at Duplicati.Library.Main.Operation.TestHandler.DoRun(Int64 samples, LocalTestDatabase db, BackendManager backend)

Jan 10, 2018 10:14 AM: Retry
Operation Get with file duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes attempt 5 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-9377faa0-3df1-4056-b264-1b2de2076662", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-9377faa0-3df1-4056-b264-1b2de2076662", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
   at Duplicati.Library.Main.BackendManager.GetForTesting(String remotename, Int64 size, String hash)
   at Duplicati.Library.Main.Operation.TestHandler.DoRun(Int64 samples, LocalTestDatabase db, BackendManager backend)

Jan 10, 2018 10:14 AM: Retry
Operation Get with file duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes attempt 4 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-c8becd6f-0217-4902-a4f4-1c1ac94dc81d", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-c8becd6f-0217-4902-a4f4-1c1ac94dc81d", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

Jan 10, 2018 10:14 AM: Retry
Operation Get with file duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes attempt 3 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-e094bb6d-92b2-4b8f-b645-5f062dfc621c", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-e094bb6d-92b2-4b8f-b645-5f062dfc621c", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

Jan 10, 2018 10:14 AM: Retry
Operation Get with file duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes attempt 2 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-c75e39c8-fccb-42b0-8efc-20987b0df55a", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-c75e39c8-fccb-42b0-8efc-20987b0df55a", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

Jan 10, 2018 10:14 AM: Retry
Operation Get with file duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes attempt 1 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-599b7bc0-1c95-4316-b748-807e3ab453bd", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-599b7bc0-1c95-4316-b748-807e3ab453bd", recorded hash: J19j3q19Cgp4rYALwHnwKNVeJXtI+XWdq1MI7zVPzPk=, actual hash yHLo+ym+I/nNS862nzELotbQk8NEDMZ4bkl0/2nqsEA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

Jan 10, 2018 10:12 AM: Message
removing file listed as Temporary: duplicati-ia7c74726eca84e9eb9339c25bc7266ff.dindex.zip.aes

Jan 10, 2018 10:12 AM: Message
removing file listed as Temporary: duplicati-b1594069909b448188a6c9a2a8bd263f9.dblock.zip.aes

Jan 10, 2018 10:12 AM: Message
No remote filesets were deleted

Remote

Jan 10, 2018 10:14 AM: get duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes
Jan 10, 2018 10:14 AM: get duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes
Jan 10, 2018 10:14 AM: get duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes
Jan 10, 2018 10:14 AM: get duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes
Jan 10, 2018 10:14 AM: get duplicati-bf81e212c1ac346899ee5df4596c09c4b.dblock.zip.aes
Jan 10, 2018 10:14 AM: get duplicati-i7655205a00814fa7b5d2cd16152cb930.dindex.zip.aes
Jan 10, 2018 10:14 AM: get duplicati-20180105T174530Z.dlist.zip.aes
Jan 10, 2018 10:12 AM: list
Jan 10, 2018 10:12 AM: list

Since this error, the next 3 executions had no errors:

If you add --log-file=<path> and --log-level=profiling to your job, then you should get a very detailed log saved to that path.
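For example, added to the exported command line from earlier (the log path here is just a placeholder; in the GUI both options can be set as advanced options on the job):

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe"
backup
"hubic://default/xxxxxxxxx?authid=xxxxxxxx"
"\\golem\localdata_r\photo\2012\\"
(... all other options from the export ...)
--log-file="C:\logs\duplicati-photo.log"
--log-level=profiling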

Well that’s good (it worked) and bad (it’s intermittent / hard to diagnose). Since nothing changed at your end I guess we’re left to assume hubic fixed something or there’s something occasionally going bad between them and you (aka flaky internet).

I think it has nothing to do with hubiC, because I have 2 jobs with this error: one of them has hubiC as the destination and the other one has Google Drive!

So I will add these 2 parameters to both jobs, and “normally” within 2 or 3 days one of them will have this error again.

Many successful runs, but now I’ve got the error again and I have the log file!

I am a little bit worried because the hashes that do not match are not the same as the hashes reported a few days ago.

Duplicati log

Failed to process file duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes => Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-b5f08c2a-44b9-4cce-96cc-4cae553f9e2c", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=

I took roughly the last 300 lines of the log, from after the JPG files were processed. Split into 2 posts because the post was too long.

Logfile part 1

2018-01-14 19:33:24Z - Profiling: BackupMainOperation took 00:00:36.178
2018-01-14 19:33:24Z - Profiling: Starting - FinalizeRemoteVolumes
2018-01-14 19:33:24Z - Profiling: FinalizeRemoteVolumes took 00:00:00.000
2018-01-14 19:33:24Z - Profiling: Starting - UpdateChangeStatistics
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT "ID" FROM "Fileset" WHERE "Timestamp" < ? AND "ID" != ? ORDER BY "Timestamp" DESC 
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT "ID" FROM "Fileset" WHERE "Timestamp" < ? AND "ID" != ? ORDER BY "Timestamp" DESC  took 00:00:00.000
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" = ? AND NOT "File"."Path" IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?)
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" = ? AND NOT "File"."Path" IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?) took 00:00:00.017
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" = ? AND NOT "File"."Path" IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?)
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" = ? AND NOT "File"."Path" IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?) took 00:00:00.000
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" = ? AND NOT "File"."Path" IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?)
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" = ? AND NOT "File"."Path" IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?) took 00:00:00.017
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" = ? AND NOT "File"."Path" IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?)
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" = ? AND NOT "File"."Path" IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?) took 00:00:00.000
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM (SELECT "File"."Path", "Blockset"."Fullhash" FROM "File", "FilesetEntry", "Metadataset", "Blockset" WHERE "File"."ID" = "FilesetEntry"."FileID" AND "Metadataset"."ID" = "File"."MetadataID" AND "File"."BlocksetID" = ? AND "Metadataset"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FilesetID" = ? ) A, (SELECT "File"."Path", "Blockset"."Fullhash" FROM "File", "FilesetEntry", "Metadataset", "Blockset" WHERE "File"."ID" = "FilesetEntry"."FileID" AND "Metadataset"."ID" = "File"."MetadataID" AND "File"."BlocksetID" = ? AND "Metadataset"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FilesetID" = ? ) B WHERE "A"."Path" = "B"."Path" AND "A"."Fullhash" != "B"."Fullhash" 
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM (SELECT "File"."Path", "Blockset"."Fullhash" FROM "File", "FilesetEntry", "Metadataset", "Blockset" WHERE "File"."ID" = "FilesetEntry"."FileID" AND "Metadataset"."ID" = "File"."MetadataID" AND "File"."BlocksetID" = ? AND "Metadataset"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FilesetID" = ? ) A, (SELECT "File"."Path", "Blockset"."Fullhash" FROM "File", "FilesetEntry", "Metadataset", "Blockset" WHERE "File"."ID" = "FilesetEntry"."FileID" AND "Metadataset"."ID" = "File"."MetadataID" AND "File"."BlocksetID" = ? AND "Metadataset"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FilesetID" = ? ) B WHERE "A"."Path" = "B"."Path" AND "A"."Fullhash" != "B"."Fullhash"  took 00:00:00.002
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM (SELECT "File"."Path", "Blockset"."Fullhash" FROM "File", "FilesetEntry", "Metadataset", "Blockset" WHERE "File"."ID" = "FilesetEntry"."FileID" AND "Metadataset"."ID" = "File"."MetadataID" AND "File"."BlocksetID" = ? AND "Metadataset"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FilesetID" = ? ) A, (SELECT "File"."Path", "Blockset"."Fullhash" FROM "File", "FilesetEntry", "Metadataset", "Blockset" WHERE "File"."ID" = "FilesetEntry"."FileID" AND "Metadataset"."ID" = "File"."MetadataID" AND "File"."BlocksetID" = ? AND "Metadataset"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FilesetID" = ? ) B WHERE "A"."Path" = "B"."Path" AND "A"."Fullhash" != "B"."Fullhash" 
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM (SELECT "File"."Path", "Blockset"."Fullhash" FROM "File", "FilesetEntry", "Metadataset", "Blockset" WHERE "File"."ID" = "FilesetEntry"."FileID" AND "Metadataset"."ID" = "File"."MetadataID" AND "File"."BlocksetID" = ? AND "Metadataset"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FilesetID" = ? ) A, (SELECT "File"."Path", "Blockset"."Fullhash" FROM "File", "FilesetEntry", "Metadataset", "Blockset" WHERE "File"."ID" = "FilesetEntry"."FileID" AND "Metadataset"."ID" = "File"."MetadataID" AND "File"."BlocksetID" = ? AND "Metadataset"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FilesetID" = ? ) B WHERE "A"."Path" = "B"."Path" AND "A"."Fullhash" != "B"."Fullhash"  took 00:00:00.000
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C" AS SELECT "File"."Path", "A"."Fullhash" AS "Filehash", "B"."Fullhash" AS "Metahash" FROM "File", "FilesetEntry", "Blockset" A, "Blockset" B, "Metadataset"  WHERE "File"."ID" = "FilesetEntry"."FileID" AND "A"."ID" = "File"."BlocksetID" AND "FilesetEntry"."FilesetID" = ? AND "File"."MetadataID" = "Metadataset"."ID" AND "Metadataset"."BlocksetID" = "B"."ID" 
2018-01-14 19:33:24Z - Profiling: ExecuteNonQuery: CREATE TEMPORARY TABLE "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C" AS SELECT "File"."Path", "A"."Fullhash" AS "Filehash", "B"."Fullhash" AS "Metahash" FROM "File", "FilesetEntry", "Blockset" A, "Blockset" B, "Metadataset"  WHERE "File"."ID" = "FilesetEntry"."FileID" AND "A"."ID" = "File"."BlocksetID" AND "FilesetEntry"."FilesetID" = ? AND "File"."MetadataID" = "Metadataset"."ID" AND "Metadataset"."BlocksetID" = "B"."ID"  took 00:00:00.012
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "TmpFileList-26DF3BEE5AE7C746A3754AC827CEF146" AS SELECT "File"."Path", "A"."Fullhash" AS "Filehash", "B"."Fullhash" AS "Metahash" FROM "File", "FilesetEntry", "Blockset" A, "Blockset" B, "Metadataset"  WHERE "File"."ID" = "FilesetEntry"."FileID" AND "A"."ID" = "File"."BlocksetID" AND "FilesetEntry"."FilesetID" = ? AND "File"."MetadataID" = "Metadataset"."ID" AND "Metadataset"."BlocksetID" = "B"."ID" 
2018-01-14 19:33:24Z - Profiling: ExecuteNonQuery: CREATE TEMPORARY TABLE "TmpFileList-26DF3BEE5AE7C746A3754AC827CEF146" AS SELECT "File"."Path", "A"."Fullhash" AS "Filehash", "B"."Fullhash" AS "Metahash" FROM "File", "FilesetEntry", "Blockset" A, "Blockset" B, "Metadataset"  WHERE "File"."ID" = "FilesetEntry"."FileID" AND "A"."ID" = "File"."BlocksetID" AND "FilesetEntry"."FilesetID" = ? AND "File"."MetadataID" = "Metadataset"."ID" AND "Metadataset"."BlocksetID" = "B"."ID"  took 00:00:00.010
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" != ? AND "File"."BlocksetID" != ? AND NOT "File"."Path" IN (SELECT "Path" FROM "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C")
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ? AND "File"."BlocksetID" != ? AND "File"."BlocksetID" != ? AND NOT "File"."Path" IN (SELECT "Path" FROM "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C") took 00:00:00.020
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C" WHERE "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C"."Path" NOT IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?)
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C" WHERE "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C"."Path" NOT IN (SELECT "Path" FROM "File" INNER JOIN "FilesetEntry" ON "File"."ID" = "FilesetEntry"."FileID" WHERE "FilesetEntry"."FilesetID" = ?) took 00:00:00.020
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C" A, "TmpFileList-26DF3BEE5AE7C746A3754AC827CEF146" B WHERE "A"."Path" = "B"."Path" AND ("A"."Filehash" != "B"."Filehash" OR "A"."Metahash" != "B"."Metahash")
2018-01-14 19:33:24Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C" A, "TmpFileList-26DF3BEE5AE7C746A3754AC827CEF146" B WHERE "A"."Path" = "B"."Path" AND ("A"."Filehash" != "B"."Filehash" OR "A"."Metahash" != "B"."Metahash") took 00:00:00.027
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C";
2018-01-14 19:33:24Z - Profiling: ExecuteNonQuery: DROP TABLE IF EXISTS "TmpFileList-5E617F77F36AAC4091E2149D1ECE6C6C"; took 00:00:00.000
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "TmpFileList-26DF3BEE5AE7C746A3754AC827CEF146";
2018-01-14 19:33:24Z - Profiling: ExecuteNonQuery: DROP TABLE IF EXISTS "TmpFileList-26DF3BEE5AE7C746A3754AC827CEF146"; took 00:00:00.002
2018-01-14 19:33:24Z - Profiling: UpdateChangeStatistics took 00:00:00.135
2018-01-14 19:33:24Z - Profiling: Starting - VerifyConsistency
2018-01-14 19:33:24Z - Profiling: Starting - ExecuteReader: SELECT "CalcLen", "Length", "A"."BlocksetID", "File"."Path" FROM (
SELECT 
    "A"."ID" AS "BlocksetID", 
    IFNULL("B"."CalcLen", 0) AS "CalcLen", 
    "A"."Length"
FROM
    "Blockset" A
LEFT OUTER JOIN
    (
        SELECT 
            "BlocksetEntry"."BlocksetID",
            SUM("Block"."Size") AS "CalcLen"
        FROM
            "BlocksetEntry"
        LEFT OUTER JOIN
            "Block"
        ON
            "Block"."ID" = "BlocksetEntry"."BlockID"
        GROUP BY "BlocksetEntry"."BlocksetID"
    ) B
ON
    "A"."ID" = "B"."BlocksetID"

) A, "File" WHERE "A"."BlocksetID" = "File"."BlocksetID" AND "A"."CalcLen" != "A"."Length" 
2018-01-14 19:33:25Z - Profiling: ExecuteReader: SELECT "CalcLen", "Length", "A"."BlocksetID", "File"."Path" FROM (
SELECT 
    "A"."ID" AS "BlocksetID", 
    IFNULL("B"."CalcLen", 0) AS "CalcLen", 
    "A"."Length"
FROM
    "Blockset" A
LEFT OUTER JOIN
    (
        SELECT 
            "BlocksetEntry"."BlocksetID",
            SUM("Block"."Size") AS "CalcLen"
        FROM
            "BlocksetEntry"
        LEFT OUTER JOIN
            "Block"
        ON
            "Block"."ID" = "BlocksetEntry"."BlockID"
        GROUP BY "BlocksetEntry"."BlocksetID"
    ) B
ON
    "A"."ID" = "B"."BlocksetID"

) A, "File" WHERE "A"."BlocksetID" = "File"."BlocksetID" AND "A"."CalcLen" != "A"."Length"  took 00:00:00.187
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteScalarInt64: SELECT Count(*) FROM "BlocklistHash"
2018-01-14 19:33:25Z - Profiling: ExecuteScalarInt64: SELECT Count(*) FROM "BlocklistHash" took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteScalarInt64: SELECT Count(*) FROM (SELECT DISTINCT "BlocksetID", "Index" FROM "BlocklistHash")
2018-01-14 19:33:25Z - Profiling: ExecuteScalarInt64: SELECT Count(*) FROM (SELECT DISTINCT "BlocksetID", "Index" FROM "BlocklistHash") took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM (SELECT * FROM (SELECT "N"."BlocksetID", (("N"."BlockCount" + 3200 - 1) / 3200) AS "BlocklistHashCountExpected", CASE WHEN "G"."BlocklistHashCount" IS NULL THEN 0 ELSE "G"."BlocklistHashCount" END AS "BlocklistHashCountActual" FROM (SELECT "BlocksetID", COUNT(*) AS "BlockCount" FROM "BlocksetEntry" GROUP BY "BlocksetID") "N" LEFT OUTER JOIN (SELECT "BlocksetID", COUNT(*) AS "BlocklistHashCount" FROM "BlocklistHash" GROUP BY "BlocksetID") "G" ON "N"."BlocksetID" = "G"."BlocksetID" WHERE "N"."BlockCount" > 1) WHERE "BlocklistHashCountExpected" != "BlocklistHashCountActual")
2018-01-14 19:33:25Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM (SELECT * FROM (SELECT "N"."BlocksetID", (("N"."BlockCount" + 3200 - 1) / 3200) AS "BlocklistHashCountExpected", CASE WHEN "G"."BlocklistHashCount" IS NULL THEN 0 ELSE "G"."BlocklistHashCount" END AS "BlocklistHashCountActual" FROM (SELECT "BlocksetID", COUNT(*) AS "BlockCount" FROM "BlocksetEntry" GROUP BY "BlocksetID") "N" LEFT OUTER JOIN (SELECT "BlocksetID", COUNT(*) AS "BlocklistHashCount" FROM "BlocklistHash" GROUP BY "BlocksetID") "G" ON "N"."BlocksetID" = "G"."BlocksetID" WHERE "N"."BlockCount" > 1) WHERE "BlocklistHashCountExpected" != "BlocklistHashCountActual") took 00:00:00.045
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "File" WHERE "BlocksetID" != ? AND "BlocksetID" != ? AND NOT "BlocksetID" IN (SELECT "BlocksetID" FROM "BlocksetEntry")
2018-01-14 19:33:25Z - Profiling: ExecuteScalarInt64: SELECT COUNT(*) FROM "File" WHERE "BlocksetID" != ? AND "BlocksetID" != ? AND NOT "BlocksetID" IN (SELECT "BlocksetID" FROM "BlocksetEntry") took 00:00:00.010
2018-01-14 19:33:25Z - Profiling: VerifyConsistency took 00:00:00.242
2018-01-14 19:33:25Z - Profiling: removing temp files, as no data needs to be uploaded
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: CREATE TEMP TABLE "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ("ID" INTEGER PRIMARY KEY)
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: CREATE TEMP TABLE "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ("ID" INTEGER PRIMARY KEY) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteScalarInt64: SELECT "ID" FROM "Remotevolume" WHERE "Name" = ?
2018-01-14 19:33:25Z - Profiling: ExecuteScalarInt64: SELECT "ID" FROM "Remotevolume" WHERE "Name" = ? took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "IndexBlockLink" WHERE "BlockVolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ) OR "IndexVolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" )
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "IndexBlockLink" WHERE "BlockVolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ) OR "IndexVolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "FilesetEntry" WHERE "FilesetID" IN (SELECT "ID" FROM "Fileset" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ))
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "FilesetEntry" WHERE "FilesetID" IN (SELECT "ID" FROM "Fileset" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" )) took 00:00:00.005
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Fileset" WHERE "VolumeID"  IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" )
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "Fileset" WHERE "VolumeID"  IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: CREATE TEMP TABLE "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ("ID" INTEGER PRIMARY KEY)
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: CREATE TEMP TABLE "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ("ID" INTEGER PRIMARY KEY) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: INSERT OR IGNORE INTO "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ("ID") SELECT "BlocksetEntry"."BlocksetID" FROM "BlocksetEntry", "Block"  WHERE "BlocksetEntry"."BlockID" = "Block"."ID" AND "Block"."VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ) UNION ALL SELECT "BlocksetID" FROM "BlocklistHash" WHERE "Hash" IN (SELECT "Hash" FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ))
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: INSERT OR IGNORE INTO "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ("ID") SELECT "BlocksetEntry"."BlocksetID" FROM "BlocksetEntry", "Block"  WHERE "BlocksetEntry"."BlockID" = "Block"."ID" AND "Block"."VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ) UNION ALL SELECT "BlocksetID" FROM "BlocklistHash" WHERE "Hash" IN (SELECT "Hash" FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" )) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "File" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ) OR "MetadataID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" )
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "File" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ) OR "MetadataID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ) took 00:00:00.002
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Metadataset" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" )
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "Metadataset" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Blockset" WHERE "ID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" )
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "Blockset" WHERE "ID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "BlocksetEntry" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" )
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "BlocksetEntry" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" ) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "BlocklistHash" WHERE "Hash" IN (SELECT "Hash" FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ))
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "BlocklistHash" WHERE "Hash" IN (SELECT "Hash" FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" )) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" )
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "DeletedBlock" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" )
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "DeletedBlock" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" ) took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574"  took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "DelVolSetIds-BD91660E62E28E40B2D39D7741947574"  took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574" 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DROP TABLE IF EXISTS "DelBlockSetIds-BD91660E62E28E40B2D39D7741947574"  took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "DelVolSetIds-BD91660E62E28E40B2D39D7741947574" 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DROP TABLE IF EXISTS "DelVolSetIds-BD91660E62E28E40B2D39D7741947574"  took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - Async backend wait
2018-01-14 19:33:25Z - Profiling: Starting - RemoteOperationTerminate
2018-01-14 19:33:25Z - Profiling: RemoteOperationTerminate took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Async backend wait took 00:00:00.002
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2018-01-14 19:33:25Z - Profiling: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2018-01-14 19:33:25Z - Profiling: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2018-01-14 19:33:25Z - Profiling: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Fileset" WHERE "Timestamp" IN () 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "Fileset" WHERE "Timestamp" IN ()  took 00:00:00.000
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "FilesetEntry" WHERE "FilesetID" NOT IN (SELECT DISTINCT "ID" FROM "Fileset")
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "FilesetEntry" WHERE "FilesetID" NOT IN (SELECT DISTINCT "ID" FROM "Fileset") took 00:00:00.002
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "File" WHERE "ID" NOT IN (SELECT DISTINCT "FileID" FROM "FilesetEntry") 

Logfile part 2

2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "File" WHERE "ID" NOT IN (SELECT DISTINCT "FileID" FROM "FilesetEntry")  took 00:00:00.005
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Metadataset" WHERE "ID" NOT IN (SELECT DISTINCT "MetadataID" FROM "File") 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "Metadataset" WHERE "ID" NOT IN (SELECT DISTINCT "MetadataID" FROM "File")  took 00:00:00.005
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Blockset" WHERE "ID" NOT IN (SELECT DISTINCT "BlocksetID" FROM "File" UNION SELECT DISTINCT "BlocksetID" FROM "Metadataset") 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "Blockset" WHERE "ID" NOT IN (SELECT DISTINCT "BlocksetID" FROM "File" UNION SELECT DISTINCT "BlocksetID" FROM "Metadataset")  took 00:00:00.012
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "BlocksetEntry" WHERE "BlocksetID" NOT IN (SELECT DISTINCT "ID" FROM "Blockset") 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "BlocksetEntry" WHERE "BlocksetID" NOT IN (SELECT DISTINCT "ID" FROM "Blockset")  took 00:00:00.082
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "BlocklistHash" WHERE "BlocksetID" NOT IN (SELECT DISTINCT "ID" FROM "Blockset") 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: DELETE FROM "BlocklistHash" WHERE "BlocksetID" NOT IN (SELECT DISTINCT "ID" FROM "Blockset")  took 00:00:00.010
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: INSERT INTO "DeletedBlock" ("Hash", "Size", "VolumeID") SELECT "Hash", "Size", "VolumeID" FROM "Block" WHERE "ID" NOT IN (SELECT DISTINCT "BlockID" AS "BlockID" FROM "BlocksetEntry" UNION SELECT DISTINCT "ID" FROM "Block", "BlocklistHash" WHERE "Block"."Hash" = "BlocklistHash"."Hash") 
2018-01-14 19:33:25Z - Profiling: ExecuteNonQuery: INSERT INTO "DeletedBlock" ("Hash", "Size", "VolumeID") SELECT "Hash", "Size", "VolumeID" FROM "Block" WHERE "ID" NOT IN (SELECT DISTINCT "BlockID" AS "BlockID" FROM "BlocksetEntry" UNION SELECT DISTINCT "ID" FROM "Block", "BlocklistHash" WHERE "Block"."Hash" = "BlocklistHash"."Hash")  took 00:00:00.540
2018-01-14 19:33:25Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Block" WHERE "ID" NOT IN (SELECT DISTINCT "BlockID" FROM "BlocksetEntry" UNION SELECT DISTINCT "ID" FROM "Block", "BlocklistHash" WHERE "Block"."Hash" = "BlocklistHash"."Hash") 
2018-01-14 19:33:26Z - Profiling: ExecuteNonQuery: DELETE FROM "Block" WHERE "ID" NOT IN (SELECT DISTINCT "BlockID" FROM "BlocksetEntry" UNION SELECT DISTINCT "ID" FROM "Block", "BlocklistHash" WHERE "Block"."Hash" = "BlocklistHash"."Hash")  took 00:00:00.485
2018-01-14 19:33:26Z - Profiling: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "State" = ? WHERE "Type" = ? AND "State" IN (?, ?) AND "ID" NOT IN (SELECT "VolumeID" FROM "Fileset") 
2018-01-14 19:33:26Z - Profiling: ExecuteNonQuery: UPDATE "RemoteVolume" SET "State" = ? WHERE "Type" = ? AND "State" IN (?, ?) AND "ID" NOT IN (SELECT "VolumeID" FROM "Fileset")  took 00:00:00.002
2018-01-14 19:33:26Z - Profiling: Starting - ExecuteReader: SELECT "Name", "Size" FROM "RemoteVolume" WHERE "Type" = ? AND "State" = ? 
2018-01-14 19:33:26Z - Profiling: ExecuteReader: SELECT "Name", "Size" FROM "RemoteVolume" WHERE "Type" = ? AND "State" = ?  took 00:00:00.000
2018-01-14 19:33:26Z - Profiling: Starting - RemoteOperationTerminate
2018-01-14 19:33:26Z - Profiling: RemoteOperationTerminate took 00:00:00.000
2018-01-14 19:33:26Z - Information: No remote filesets were deleted
2018-01-14 19:33:26Z - Profiling: Starting - CommitFinalizingBackup
2018-01-14 19:33:26Z - Profiling: CommitFinalizingBackup took 00:00:00.027
2018-01-14 19:33:26Z - Profiling: Starting - AfterBackupVerify
2018-01-14 19:33:26Z - Profiling: Starting - RemoteOperationList
2018-01-14 19:33:26Z - Information: Backend event: List - Started:  ()
2018-01-14 19:33:29Z - Information: Backend event: List - Completed:  (2.04 KB)
2018-01-14 19:33:29Z - Profiling: RemoteOperationList took 00:00:03.140
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting")
2018-01-14 19:33:29Z - Profiling: ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting") took 00:00:00.002
2018-01-14 19:33:29Z - Information: removing file listed as Temporary: duplicati-b1e0eb31725f44abd9ee949879b9dda03.dblock.zip.aes
2018-01-14 19:33:29Z - Information: removing file listed as Temporary: duplicati-i3ea045afd0374260b29ecfeb40ebfa86.dindex.zip.aes
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: CREATE TEMP TABLE "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ("ID" INTEGER PRIMARY KEY)
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: CREATE TEMP TABLE "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ("ID" INTEGER PRIMARY KEY) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteScalarInt64: SELECT "ID" FROM "Remotevolume" WHERE "Name" = ?
2018-01-14 19:33:29Z - Profiling: ExecuteScalarInt64: SELECT "ID" FROM "Remotevolume" WHERE "Name" = ? took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteScalarInt64: SELECT "ID" FROM "Remotevolume" WHERE "Name" = ?
2018-01-14 19:33:29Z - Profiling: ExecuteScalarInt64: SELECT "ID" FROM "Remotevolume" WHERE "Name" = ? took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "IndexBlockLink" WHERE "BlockVolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) OR "IndexVolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "IndexBlockLink" WHERE "BlockVolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) OR "IndexVolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) took 00:00:00.002
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "FilesetEntry" WHERE "FilesetID" IN (SELECT "ID" FROM "Fileset" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ))
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "FilesetEntry" WHERE "FilesetID" IN (SELECT "ID" FROM "Fileset" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Fileset" WHERE "VolumeID"  IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "Fileset" WHERE "VolumeID"  IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: CREATE TEMP TABLE "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ("ID" INTEGER PRIMARY KEY)
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: CREATE TEMP TABLE "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ("ID" INTEGER PRIMARY KEY) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: INSERT OR IGNORE INTO "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ("ID") SELECT "BlocksetEntry"."BlocksetID" FROM "BlocksetEntry", "Block"  WHERE "BlocksetEntry"."BlockID" = "Block"."ID" AND "Block"."VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) UNION ALL SELECT "BlocksetID" FROM "BlocklistHash" WHERE "Hash" IN (SELECT "Hash" FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ))
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: INSERT OR IGNORE INTO "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ("ID") SELECT "BlocksetEntry"."BlocksetID" FROM "BlocksetEntry", "Block"  WHERE "BlocksetEntry"."BlockID" = "Block"."ID" AND "Block"."VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) UNION ALL SELECT "BlocksetID" FROM "BlocklistHash" WHERE "Hash" IN (SELECT "Hash" FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )) took 00:00:00.002
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "File" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) OR "MetadataID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "File" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) OR "MetadataID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) took 00:00:00.002
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Metadataset" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "Metadataset" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Blockset" WHERE "ID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "Blockset" WHERE "ID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "BlocksetEntry" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "BlocksetEntry" WHERE "BlocksetID" IN (SELECT "ID" FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "BlocklistHash" WHERE "Hash" IN (SELECT "Hash" FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ))
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "BlocklistHash" WHERE "Hash" IN (SELECT "Hash" FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )) took 00:00:00.002
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "Block" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "DeletedBlock" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" )
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "DeletedBlock" WHERE "VolumeID" IN (SELECT "ID" FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" ) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" 
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26"  took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" 
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DELETE FROM "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26"  took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26" 
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DROP TABLE IF EXISTS "DelBlockSetIds-05A4FEFC34AF1448B3D8F479C7504D26"  took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26" 
2018-01-14 19:33:29Z - Profiling: ExecuteNonQuery: DROP TABLE IF EXISTS "DelVolSetIds-05A4FEFC34AF1448B3D8F479C7504D26"  took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: AfterBackupVerify took 00:00:03.460
2018-01-14 19:33:29Z - Profiling: Starting - RemoteOperationTerminate
2018-01-14 19:33:29Z - Profiling: RemoteOperationTerminate took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2018-01-14 19:33:29Z - Profiling: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteReader: SELECT "ID", "Name", "Size", "Hash", "VerificationCount" FROM "Remotevolume" WHERE ("State" = ? OR "State" = ?) AND ("Hash" = "" OR "Hash" IS NULL OR "Size" <= 0) 
2018-01-14 19:33:29Z - Profiling: ExecuteReader: SELECT "ID", "Name", "Size", "Hash", "VerificationCount" FROM "Remotevolume" WHERE ("State" = ? OR "State" = ?) AND ("Hash" = "" OR "Hash" IS NULL OR "Size" <= 0)  took 00:00:00.002
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteScalarInt64: SELECT MAX("VerificationCount") FROM "RemoteVolume"
2018-01-14 19:33:29Z - Profiling: ExecuteScalarInt64: SELECT MAX("VerificationCount") FROM "RemoteVolume" took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteReader: SELECT "A"."VolumeID", "A"."Name", "A"."Size", "A"."Hash", "A"."VerificationCount" FROM (SELECT "ID" AS "VolumeID", "Name", "Size", "Hash", "VerificationCount" FROM "Remotevolume" WHERE "State" IN (?, ?)) A, "Fileset"  WHERE  "A"."VolumeID" = "Fileset"."VolumeID" ORDER BY "Fileset"."Timestamp" 
2018-01-14 19:33:29Z - Profiling: ExecuteReader: SELECT "A"."VolumeID", "A"."Name", "A"."Size", "A"."Hash", "A"."VerificationCount" FROM (SELECT "ID" AS "VolumeID", "Name", "Size", "Hash", "VerificationCount" FROM "Remotevolume" WHERE "State" IN (?, ?)) A, "Fileset"  WHERE  "A"."VolumeID" = "Fileset"."VolumeID" ORDER BY "Fileset"."Timestamp"  took 00:00:00.002
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteReader: SELECT "ID", "Name", "Size", "Hash", "VerificationCount" FROM "Remotevolume" WHERE "Type" = ? AND "State" IN (?, ?)
2018-01-14 19:33:29Z - Profiling: ExecuteReader: SELECT "ID", "Name", "Size", "Hash", "VerificationCount" FROM "Remotevolume" WHERE "Type" = ? AND "State" IN (?, ?) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - ExecuteReader: SELECT "ID", "Name", "Size", "Hash", "VerificationCount" FROM "Remotevolume" WHERE "Type" = ? AND "State" IN (?, ?)
2018-01-14 19:33:29Z - Profiling: ExecuteReader: SELECT "ID", "Name", "Size", "Hash", "VerificationCount" FROM "Remotevolume" WHERE "Type" = ? AND "State" IN (?, ?) took 00:00:00.000
2018-01-14 19:33:29Z - Profiling: Starting - RemoteOperationGet
2018-01-14 19:33:29Z - Information: Backend event: Get - Started: duplicati-20180105T174233Z.dlist.zip.aes (1.34 KB)
2018-01-14 19:33:30Z - Profiling: Downloaded 1.34 KB in 00:00:00.9325143, 1.44 KB/s
2018-01-14 19:33:30Z - Information: Backend event: Get - Completed: duplicati-20180105T174233Z.dlist.zip.aes (1.34 KB)
2018-01-14 19:33:30Z - Profiling: RemoteOperationGet took 00:00:00.932
2018-01-14 19:33:30Z - Profiling: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = ?
2018-01-14 19:33:30Z - Profiling: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = ? took 00:00:00.040
2018-01-14 19:33:30Z - Profiling: Starting - RemoteOperationGet
2018-01-14 19:33:30Z - Information: Backend event: Get - Started: duplicati-i3b981ea5d8de4d55afc9e5f150e586b4.dindex.zip.aes (12.20 KB)
2018-01-14 19:33:31Z - Profiling: Downloaded 12.20 KB in 00:00:00.3725057, 32.75 KB/s
2018-01-14 19:33:31Z - Information: Backend event: Get - Completed: duplicati-i3b981ea5d8de4d55afc9e5f150e586b4.dindex.zip.aes (12.20 KB)
2018-01-14 19:33:31Z - Profiling: RemoteOperationGet took 00:00:00.372
2018-01-14 19:33:31Z - Profiling: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = ?
2018-01-14 19:33:31Z - Profiling: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = ? took 00:00:00.062
2018-01-14 19:33:31Z - Profiling: Starting - RemoteOperationGet
2018-01-14 19:33:31Z - Information: Backend event: Get - Started: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:34:08Z - Profiling: Downloaded 14.91 MB in 00:00:37.7330780, 404.60 KB/s
2018-01-14 19:34:08Z - Information: Backend event: Get - Completed: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:34:08Z - Profiling: RemoteOperationGet took 00:00:37.738
2018-01-14 19:34:08Z - Warning: Operation Get with file duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes attempt 1 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-e32ca024-c0d7-4791-8f0d-d44044d1da01", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-e32ca024-c0d7-4791-8f0d-d44044d1da01", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

2018-01-14 19:34:08Z - Information: Backend event: Get - Retrying: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:34:18Z - Profiling: Starting - RemoteOperationGet
2018-01-14 19:34:18Z - Information: Backend event: Get - Started: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:34:54Z - Profiling: Downloaded 14.91 MB in 00:00:35.5030439, 430.01 KB/s
2018-01-14 19:34:54Z - Information: Backend event: Get - Completed: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:34:54Z - Profiling: RemoteOperationGet took 00:00:35.508
2018-01-14 19:34:54Z - Warning: Operation Get with file duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes attempt 2 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-3565b2da-59aa-4328-9271-6255d513b987", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-3565b2da-59aa-4328-9271-6255d513b987", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

2018-01-14 19:34:54Z - Information: Backend event: Get - Retrying: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:35:04Z - Profiling: Starting - RemoteOperationGet
2018-01-14 19:35:04Z - Information: Backend event: Get - Started: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:35:39Z - Profiling: Downloaded 14.91 MB in 00:00:34.6180304, 441.01 KB/s
2018-01-14 19:35:39Z - Information: Backend event: Get - Completed: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:35:39Z - Profiling: RemoteOperationGet took 00:00:34.623
2018-01-14 19:35:39Z - Warning: Operation Get with file duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes attempt 3 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-91231ec7-82a4-4bb2-bc0e-f1be1aa54ebb", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-91231ec7-82a4-4bb2-bc0e-f1be1aa54ebb", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

2018-01-14 19:35:39Z - Information: Backend event: Get - Retrying: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:35:49Z - Profiling: Starting - RemoteOperationGet
2018-01-14 19:35:49Z - Information: Backend event: Get - Started: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:36:23Z - Profiling: Downloaded 14.91 MB in 00:00:34.3580263, 444.34 KB/s
2018-01-14 19:36:23Z - Information: Backend event: Get - Completed: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:36:23Z - Profiling: RemoteOperationGet took 00:00:34.363
2018-01-14 19:36:23Z - Warning: Operation Get with file duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes attempt 4 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-85ac0be0-87ff-48c2-800f-e4955d417d30", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-85ac0be0-87ff-48c2-800f-e4955d417d30", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

2018-01-14 19:36:23Z - Information: Backend event: Get - Retrying: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:36:33Z - Profiling: Starting - RemoteOperationGet
2018-01-14 19:36:33Z - Information: Backend event: Get - Started: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:37:07Z - Profiling: Downloaded 14.91 MB in 00:00:33.8205181, 451.41 KB/s
2018-01-14 19:37:07Z - Information: Backend event: Get - Completed: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:37:07Z - Profiling: RemoteOperationGet took 00:00:33.825
2018-01-14 19:37:07Z - Warning: Operation Get with file duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes attempt 5 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-b5f08c2a-44b9-4cce-96cc-4cae553f9e2c", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-b5f08c2a-44b9-4cce-96cc-4cae553f9e2c", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()

2018-01-14 19:37:07Z - Information: Backend event: Get - Failed: duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes (14.91 MB)
2018-01-14 19:37:07Z - Error: Failed to process file duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-b5f08c2a-44b9-4cce-96cc-4cae553f9e2c", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=
   at Duplicati.Library.Main.BackendManager.GetForTesting(String remotename, Int64 size, String hash)
   at Duplicati.Library.Main.Operation.TestHandler.DoRun(Int64 samples, LocalTestDatabase db, BackendManager backend)

2018-01-14 19:37:07Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < ?
2018-01-14 19:37:07Z - Profiling: ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < ? took 00:00:00.000
2018-01-14 19:37:07Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < ?
2018-01-14 19:37:07Z - Profiling: ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < ? took 00:00:00.002
2018-01-14 19:37:07Z - Profiling: Starting - ExecuteNonQuery: VACUUM
2018-01-14 19:37:10Z - Profiling: ExecuteNonQuery: VACUUM took 00:00:02.767
2018-01-14 19:37:10Z - Profiling: Running Backup took 00:04:26.679

Sorry to hear the errors are back - but it’s good that you have the logs!

So at 19:34:08Z we can see the first of 5 downloads of a dblock (archive volume) file. Each attempt results in a different temp file name (as expected) as well as the same error: the hash of the downloaded file never matches the expected hash recorded in the database.

2018-01-14 19:34:08Z - Warning: Operation Get with file duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes attempt 1 of 5 failed with message: Hash mismatch on file "C:\Users\server\AppData\Local\Temp\dup-e32ca024-c0d7-4791-8f0d-d44044d1da01", recorded hash: RgzZaMU3n4QmMLZfFeEq/82tYaBCbuqwJghP/a9o5II=, actual hash KYfqN3xHnHDKb/pPGvBfD6ripcYAoXkDaln9AN1uMxA=

As I think I mentioned before, the file SIZE is correct in all 5 downloads, otherwise we would have seen a file size error instead of a hash error. So we’re still stuck with the question of whether the database or the actual file is correct.
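One way to narrow that down is to download the dblock file manually and recompute its hash yourself. The hashes in the log are 44-character base64 strings, i.e. 32-byte digests, which matches Duplicati using SHA-256 for volume verification. A minimal sketch in Python (the download path below is just a hypothetical example):

    import base64
    import hashlib

    def duplicati_file_hash(path, chunk_size=1024 * 1024):
        """Base64-encoded SHA-256 digest of a file, the format the log reports."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while chunk := f.read(chunk_size):
                h.update(chunk)
        return base64.b64encode(h.digest()).decode("ascii")

    # Hypothetical example: compare the output against the hashes in the log.
    print(duplicati_file_hash(r"C:\downloads\duplicati-b610f7bfd35c24b98883dc14a751c223b.dblock.zip.aes"))

If the recomputed hash matches the “actual hash” from the log, the remote file really has changed since upload; if it matches the “recorded hash”, something is corrupting the download in transit.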

I’d have to poke around, but I believe there is a parameter that tells Duplicati not to do the hash test (or to ignore the results), which would allow you to do a restore regardless of the error; at that point you could manually verify the resulting files.

Unfortunately, I still have no idea what is causing this issue in the first place. I think @kenkendk is the expert on the database (and hashing) stuff so hopefully he’ll get a chance to see this and have some ideas.

Also for me the “hash mismatch” error is back and I don’t know how to handle it. Last time I just created a new backup set, but that can’t be the final solution!?

I searched the databases (table LogData) for hash errors and noticed something strange.

It seems to make a difference what kind of target is used!

  • Target UNC, local LAN, wired (8 jobs, about 32 executions per day) Hash mismatch: 0 times!!!

  • Target local HDD (6 jobs, about 24 executions per day) Hash mismatch: 0 times!!!

  • Target SSH over Internet to a private machine (1 job, 24 executions per day) Hash mismatch: 1 time, and that was just “attempt 1 of 5”!

  • Target Google (1 job, 20 executions per day) Hash mismatch: 0 times!!!

  • Target meganz (7 jobs, about 20 executions per day) Hash mismatch: 1 time, and that was just “attempt 1 of 5”!

--------------------------------------------------------------------------------
Starting here are the “problem” targets

  • Target Dropbox (2 jobs, 12 executions per day) Hash mismatch: 347 times
    347/5 attempts ≈ 70 failures => 1 failure per 5 runs

  • Target webdav boxcom (1 job, 10 executions per day) Hash mismatch: 101 times
    101/5 attempts ≈ 20 failures => 1 failure per 15 runs

  • Target webdav terracloud (1 job, 1 execution per day) Hash mismatch: 10 times
    10/5 attempts = 2 failures => 1 failure per 15 runs

  • Target hubiC (2 jobs, about 8 executions per day) Hash mismatch: 40 times
    40/5 attempts = 8 failures => 1 failure per 30 runs

So it seems there is a big problem with certain hosters / access methods: no problem with meganz, no problem with SSH, and no problem with local LAN or local disk.

But many problems with webdav boxcom, webdav terracloud and hubiC.
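For anyone who wants to reproduce these counts: each job keeps its own SQLite database, so the LogData table can be queried directly. A rough sketch in Python (the database path is a hypothetical example; the Timestamp column is visible in the profiling log above, but the "Message" column name is an assumption):

    import sqlite3

    # Hypothetical path to one job's local database; find yours under the
    # Duplicati data folder. The "Message" column name is an assumption.
    db = sqlite3.connect(r"C:\Users\server\AppData\Local\Duplicati\XXXXXXXXXX.sqlite")
    (count,) = db.execute(
        "SELECT COUNT(*) FROM LogData WHERE Message LIKE '%Hash mismatch%'"
    ).fetchone()
    print("hash mismatch log entries:", count)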

I can’t find this command-line option. Do you know its name?

I have 3 Linux systems running Duplicati and all of them are backing up to WebDAV on a QNAP NAS. All of them have more than one backup set defined. But only the home folder on my desktop machine is getting this hash mismatch error.

While it shouldn’t be considered a fix (we really need to try and figure out why it’s happening in the first place), see if this helps at all:

--skip-file-hash-checks (default false)
If the hash for the volume does not match, Duplicati will refuse to use the backup. Supply this flag to allow Duplicati to proceed anyway.
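As a usage example (assuming the standard Duplicati 2 command-line client; the storage URL and file pattern below are placeholders you would fill in for your own job):

    Duplicati.CommandLine.exe restore <storage-url> "*" --skip-file-hash-checks=true

In the GUI, the same option can be added under the job’s advanced options.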

Hello, I think I found a workaround.

I had similar problem:
Errors: [ Failed to process file duplicati-20180221T192419Z.dlist.zip.aes => Hash mismatch on file "d:\Duplicati2_TEMP\dup-2cce5e16-d8f3-4416-ae21-cced08df2e4d", recorded hash: jm9haTFVB2As8Fg4rv9EwMBfsUbzEtDyap+CqYfmRXw=, actual hash 64D7qPZTijzv9qgmnOumGmPYPE74tzRxlFBto0h19bI=

I tried backup, verify, repair, upgrading, deleting all temp files… nothing helped…

So I ran DuplicatiVerify.ps1 from C:\Program Files\Duplicati 2\utility-scripts
and after a couple of minutes I saw:
*** Hash check failed for file: duplicati-20180221T192419Z.dlist.zip.aes

So I renamed that file and ran the backup. Duplicati told me to run a repair, so I did, and after that the error message was gone.
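For reference, the shipped verification scripts (DuplicatiVerify.ps1 and its Python counterpart) read the duplicati-verification.json file that Duplicati uploads when the --upload-verification-file option is enabled, and recompute each volume’s hash. A minimal Python sketch of the same idea — the JSON layout and field names (Name, Hash) are assumptions here, so check them against the shipped script before relying on this:

    import base64
    import hashlib
    import json
    import os
    import sys

    remote_dir = sys.argv[1]  # folder holding the backup volumes

    # Field names (Name/Hash) are assumptions; compare with the shipped script.
    with open(os.path.join(remote_dir, "duplicati-verification.json")) as f:
        entries = json.load(f)

    for entry in entries:
        h = hashlib.sha256()
        with open(os.path.join(remote_dir, entry["Name"]), "rb") as vol:
            while chunk := vol.read(1024 * 1024):
                h.update(chunk)
        if base64.b64encode(h.digest()).decode("ascii") != entry["Hash"]:
            print("*** Hash check failed for file:", entry["Name"])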