Please verify the sha256 hash?

I believe this is the size stored in the local database. When Duplicati creates an archive file, it stores the file size in the database and then uses it as a quick check that the uploaded file size didn’t change (such as due to an internet transmission issue).
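
If the destination is a locally mounted path (as with the CIFS setups later in this thread), you can reproduce the same quick check by hand. This is a minimal sketch, not Duplicati’s actual code - the mount point is a placeholder, and the file name and expected size are taken from the repair output quoted below:

    # Print the on-disk size of a volume and compare it with the size Duplicati
    # recorded (47521821 bytes in the example below). /mnt/backup is a placeholder.
    stat -c %s /mnt/backup/duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes
    # A smaller number means the file was truncated at (or on the way to) the destination.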

So in this case the size of the destination file seems to have changed since it was created. If it’s a recent file, it’s possible something happened during the initial upload. If it’s an older file then something may have happened at the destination.

In either case, I think what needs to happen is for the archive file to be downloaded and verified. If it’s a valid archive, I expect Duplicati would update the local database size.
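
If you want to try that from a terminal, the command-line tool has a test command that downloads a sample of remote volumes and verifies them. A sketch with placeholder values - substitute your own storage URL and database path, and drop the leading mono on Windows:

    # Download and verify 5 sample volume sets from the destination.
    mono Duplicati.CommandLine.exe test "ssh://user@host/backup" 5 --dbpath=/path/to/local-database.sqlite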

Unfortunately, I don’t know how to make that happen - but @kenkendk, if not others, certainly should.

Interesting. I tried to delete and recreate the database several times and I still get the error. Then I found the “repair” option under Command Line, which gives me:

Listing remote folder ...
remote file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes is listed as Verified with size 37421056 but should be 47521821, please verify the sha256 hash "o83q16QqxPVeE7K/PIJpOH0tpn+/JczT3fvAn+K4Mds="
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 1 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 2 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 3 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 4 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 5 of 5 failed with message: File length is invalid => File length is invalid
Failed to perform verification for file: duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes, please run verify; message: File length is invalid => File length is invalid

At this point I’ve given up on the old backup set. I’ve moved it to an old/ directory and I’m just starting a new backup. I probably should have done that all along, but for anyone not using local storage it would be a pain to re-upload many GB of backup sets (I have other offsite backups).

Replying to myself, after moving the old backup files, deleting the database, and re-running the backup job, I’m back to hash mismatch errors. I’m starting to wonder if this is an issue with the canary build; I’ll try going back to the beta from before and see if anything changes.

  Failed to process file duplicati-b4e99860187e14225b262ca87c065dc1c.dblock.zip.aes => Hash mismatch on file "/tmp/dup-0595915d-c50c-4ce8-9594-43629189571a", recorded hash: oxOTds52+6CjHn0mc7UZ58KbzfhS5lZ1xW9gmewaHuw=, actual hash 3qvqe+jOVP0ktD83p/Mor8UbfasehAKOZa5iA1OVYSQ=

I don’t recall hearing about it from others, but then it may be you’re using a destination or setting others aren’t.

If you’ve got the time, then the beta rollback sounds like a great way to help isolate the issue.

Hi,

I recently moved to Duplicati from CrashPlan, but after my first SFTP backup I got the same issue. I made a backup of ~300 GB of data to an SFTP server, but during the backup my computer rebooted due to a Windows update. I don’t know if that has something to do with this, but I think I had to rebuild the database afterwards. However, I have verified the size of the file both by using WinSCP to connect to the server and directly in Windows on the host machine, and it is 44493660 as stated in the warning messages. The SHA-256 is 2d8582625abd5ad4a1d81c1bb772f39d243434080024e1de683b55df13aff5ac according to the SFTP server and WinSCP. That’s not the same as in the warning messages.
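
A side note on comparing hashes this way: Duplicati prints SHA-256 digests base64-encoded, while WinSCP and most command-line tools print hex, so the two forms never look alike even for the identical digest. Also, the hash in the warning appears to be the one recorded for the full 52427501-byte volume, so a truncated file would not match it in any encoding. A sketch for converting the hex value above into Duplicati’s base64 form:

    # Convert a hex SHA-256 digest to the base64 encoding Duplicati prints.
    echo 2d8582625abd5ad4a1d81c1bb772f39d243434080024e1de683b55df13aff5ac | xxd -r -p | base64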

The log:

DeletedFolders: 0
ModifiedFiles: 2
ExaminedFiles: 42400
OpenedFiles: 2
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 374410675997
SizeOfOpenedFiles: 11098
NotProcessedFiles: 0
AddedFolders: 0
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults:
    DeletedFileCount: 278
    DownloadedFileCount: 0
    UploadedFileCount: 0
    DeletedFileSize: 7285059022
    DownloadedFileSize: 0
    UploadedFileSize: 0
    Dryrun: False
    MainOperation: Compact
    ParsedResult: Success
    EndTime: 2018-01-03 07:04:33
    BeginTime: 2018-01-03 07:04:07
    Duration: 00:00:25.4041760
    BackendStatistics:
        RemoteCalls: 286
        BytesUploaded: 4842823
        BytesDownloaded: 57254007
        FilesUploaded: 3
        FilesDownloaded: 3
        FilesDeleted: 278
        FoldersCreated: 0
        RetryAttempts: 0
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 13225
        KnownFileSize: 346541677172
        LastBackupDate: 2018-01-03 08:00:52
        BackupListCount: 3
        TotalQuotaSpace: 0
        FreeQuotaSpace: 0
        AssignedQuotaSpace: -1
        ParsedResult: Success
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    ParsedResult: Success
    EndTime: 2018-01-03 07:04:33
    BeginTime: 2018-01-03 07:03:52
    Duration: 00:00:40.8690253
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20180103T070052Z.dlist.zip.aes
        Value: [],
        Key: duplicati-ibe3266c190ef4ce8825c9889024f89e8.dindex.zip.aes
        Value: [],
        Key: duplicati-bdd997f12e438448cb7242f4df0c42055.dblock.zip.aes
        Value: []
    ]
    ParsedResult: Success
    EndTime: 2018-01-03 07:05:58
    BeginTime: 2018-01-03 07:04:40
    Duration: 00:01:18.0469487
ParsedResult: Warning
EndTime: 2018-01-03 07:05:58
BeginTime: 2018-01-03 07:00:52
Duration: 00:05:05.7192388
Messages: [
    No remote filesets were deleted,
    Compacting because there are 139 fully deletable volume(s),
    Deleted 278 files, which reduced storage by 6,78 GB,
    removing file listed as Temporary: duplicati-b80e72fb2a3834a4da17616097735c630.dblock.zip.aes,
    removing file listed as Temporary: duplicati-ief8fb87b586e40fe9d9f47414c5d8003.dindex.zip.aes
]
Warnings: [
    remote file duplicati-b1e57936bafbd4f25a0678cf6c5d0303b.dblock.zip.aes is listed as Verified with size 44493660 but should be 52427501, please verify the sha256 hash "6X37MDa+Mw2uGm/P569ExKKiN3dIIW69LDbndprv9hY=",
    remote file duplicati-b1e57936bafbd4f25a0678cf6c5d0303b.dblock.zip.aes is listed as Verified with size 44493660 but should be 52427501, please verify the sha256 hash "6X37MDa+Mw2uGm/P569ExKKiN3dIIW69LDbndprv9hY="
]
Errors: []

How should I proceed to solve this? I have tried to restore a file, which worked, but I got the same warning.

Most likely the remote file is broken, since it is missing some data.

I suggest you remove the file and then run the purge-broken-files command from the “Commandline” area.
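
For reference, a sketch of the same thing from a terminal instead of the GUI (the storage URL and database path are placeholders; list-broken-files previews what would be affected, and --dry-run makes purge-broken-files report without changing anything):

    # Preview which backed-up files reference the broken volume.
    mono Duplicati.CommandLine.exe list-broken-files "ssh://user@host/backup" --dbpath=/path/to/local-database.sqlite
    # After deleting the damaged dblock from the destination, purge the dangling entries.
    mono Duplicati.CommandLine.exe purge-broken-files "ssh://user@host/backup" --dbpath=/path/to/local-database.sqlite --dry-run
    mono Duplicati.CommandLine.exe purge-broken-files "ssh://user@host/backup" --dbpath=/path/to/local-database.sqlite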

I realize this would be a decent chunk of work, but would it make sense to have the GUI provide a way to let Duplicati remove the remote file rather than needing a user to do it (and risk killing the wrong file)?

An added benefit could be a notice of what files/versions would be affected by the loss of that remote file…

Thanks, that solved my problem!

Yes, it simply requires that I (or someone else) provide the UI for “purge-broken-files”, which I think we should do anyway.

Glad to hear you got things working again!

Let me know if I got it wrong, but I went ahead and flagged the post to which I think you were referring when you said your problem was solved. :slight_smile:

Does Duplicati not support backing up to a NAS mounted via CIFS? I went back to the beta version, removed all the old backup files from the destination, recreated the database, and started backing up from scratch, and I’m still getting hundreds of these errors on one job. It appears every file is now failing with the “please verify the sha256 hash” error.

I’m thinking Duplicati doesn’t like something in my new configuration, but I don’t understand what to do. If the remote file is a different size than expected, can Duplicati reconcile the difference and just say “OK, now the file is X size, it’s fine”?

Edit - I just looked, and 3 of my backup jobs are having this problem. The only one that is not is a small DAVFS job. The destination is the same, but the source is DAVFS instead of CIFS. The error messages seem to imply that the problem is on the storage/destination side, not the source side, so I don’t know whether it means anything that one job is working (maybe just because it’s much smaller than the others).

Edit 2 - check that - another job that is failing is smaller than the job that is working, so it doesn’t appear to be a backup size issue.

By the way, JonMikeIV, I really appreciate your input on all my recent threads as I try to get Duplicati working, but this issue is not solved - you were replying to Kenny in this thread, and he apparently solved his issue, but I still have not managed to get rid of this error.

To do what Kenny did, does that mean I have to delete the entire remote backup archive (the aes files) and run purge-broken-files? Would that wipe out all my backups for this job and start over?

Deleting the aes files would indeed be starting from scratch. You’d be better off deleting the job - or at least creating (export/import?) a new one to a different folder on your destination.

I’m a little confused - are the sources DAVFS and CIFS (so they’re not local to the machine running Duplicati) or the destinations (or both)?

It is supposed to work. The error messages seem to indicate that the files are somehow truncated after being stored.

Duplicati basically does a file copy from the temporary folder to the destination, so I would assume that you could replicate a similar problem if you copied a file to the CIFS destination manually?
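
One way to test that manually, assuming the NAS is mounted at a placeholder path like /mnt/nas: copy a file of roughly your volume size to the share and compare size and hash on both ends.

    # Create a ~50MB test file, copy it over CIFS, then compare both copies.
    dd if=/dev/urandom of=/tmp/cifs-test.bin bs=1M count=50
    cp /tmp/cifs-test.bin /mnt/nas/cifs-test.bin
    ls -l /tmp/cifs-test.bin /mnt/nas/cifs-test.bin       # sizes should be identical
    sha256sum /tmp/cifs-test.bin /mnt/nas/cifs-test.bin   # hashes should be identical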

What I am doing is running Duplicati on an Ubuntu server. It mounts various shares via CIFS and the backup destination is a NAS on the same network.

What I’ve done is recreate the backup jobs from scratch (not restoring from the config files of the previous backup setup), with entirely new destinations, and so far I am not seeing the sha256 hash error. The old jobs are still running as well, and I am still getting the errors on those jobs. I will run the new jobs concurrently for a few days and see if the errors return on the new jobs.

Edit: still broken on my machine, see later post.

I’m going to call this one solved, though I don’t know exactly how. Recreating all the backup jobs from scratch seems to have fixed it; it has now been several days and I haven’t seen any errors in the logs.

Well the error is back again, just took a few more days, so it’s unfortunately not solved, but I’ll update this thread if I find out anything more.

Is it the same file(s) reported each time or do they change?

I just re-ran the backup and between yesterday and today, the same 38 files appear to have the sha256 error.

“remote file duplicati-b19edfe0fbf7c4ca49e09c569e6824fac.dblock.zip.aes is listed as Verified with size 10934076 but should be 52406685, please verify the sha256 hash “sKqBc3FLBHZ1b0FDO0t0xuNG1jIyXZrON0zsKAC0aFQ=”,”

It appears that this file is in fact 10MB (so the verified size matches the actual file size on the NAS), but it was modified 1/14 during a backup operation that reported no warnings or errors.

I guess I will try moving the destination back to the local machine again to see if I can isolate this to the remote storage.

In your backup job, is your “Volume size” set to 10MB or the default 50MB?

Are all the reported files (or is that ALL the files) the same 10MB size?

I’m wondering if something is capping / truncating your supposed-to-be 50MB files to 10MB for some reason…
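
If the destination is mounted locally, a quick check is to list any dblock volumes that are well under the expected size - with a 50MB volume size, only the last volume of a backup run should legitimately be smaller. The mount point here is a placeholder:

    # List dblock volumes smaller than ~45MB; adjust the path and threshold to your setup.
    find /mnt/nas/backup -name '*.dblock.zip.aes' -size -45M -exec ls -l {} \;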
