Please verify the sha256 hash?

Edit: I think I’m having filesystem errors. I’m getting a read-only notice on the remote share now, so that must be why the remote files can’t be fixed. I can’t seem to delete this post right now, but if I figure this out I’ll write up what happened.

Can you verify whether the read-only state is due to the file permissions, the share permissions, or the account being used to access the share?

While I’ve never run into the “please verify the SHA256 hash” message before, another user did here:

However, it’s unlikely a read-only scenario would cause this message. It might be worth “manually” checking one or two file sizes on the destination to see what they are shown as. This could help narrow down whether there’s an actual file size issue (maybe a filesystem issue, as you said) or just a REPORTED one (maybe your destination share / FTP / SFTP / etc. server has a problem reporting the real file size).
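
If the destination is mounted locally (e.g. over CIFS), a rough check along these lines would show whether the bytes are really there or whether the size is just being mis-reported - the paths you pass in are placeholders for your own dblock files:

```python
# Rough sketch, assuming the destination is a locally mounted share and the
# paths passed on the command line are the backup volumes to check.
# Compares the stat()-reported size with the number of bytes actually readable.
import os
import sys

def check(path):
    reported = os.stat(path).st_size
    readable = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(1 << 20)
            if not chunk:
                break
            readable += len(chunk)
    status = "OK" if reported == readable else "MISMATCH"
    print(f"{status}  reported={reported}  readable={readable}  {path}")

if __name__ == "__main__":
    for p in sys.argv[1:]:
        check(p)
```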

Thank you JonMikeIV. I’m not certain, because of my 3 backup jobs, one exhibited this error, another seemed to transfer over perfectly, and a third says I deleted and re-added all 64K files:

DeletedFiles: 64598
DeletedFolders: 5536
ModifiedFiles: 0
ExaminedFiles: 64611
OpenedFiles: 64611
AddedFiles: 64562

For the one that had the hash error, I just ended up deleting the backup set and re-running the backup (I didn’t need archives in this case) and it’s fine.

Now I’m having another issue with a hash mismatch on another job. It seems I’ll just save my old backups in case I need to restore, and restart these jobs from scratch; I probably did something wrong reinstalling or exporting and importing the jobs over to the new machine.

Edit: I also moved from beta to canary, so that could be causing these issues as well, I suppose.

Did you happen to move or rename a folder (it might even be case-sensitive, depending on the OS)?

Duplicati would see that as a delete followed by an add of the exact same content. It would cause a lot of SQLite database processing, but very few actual destination uploads thanks to de-duplication.
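
As a toy illustration (not Duplicati’s actual code) of why a rename costs almost nothing at the destination: blocks are identified by their content hash, so the “new” files after a rename produce no blocks that aren’t already stored:

```python
# Toy illustration of block-level de-duplication, not Duplicati's real code:
# blocks are keyed by their content hash, so renaming a folder changes file
# entries but produces no new blocks to upload.
import hashlib

def block_hashes(data: bytes, block_size: int = 4) -> set:
    return {hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)}

already_stored = block_hashes(b"same file content")  # blocks from the old path
after_rename = block_hashes(b"same file content")    # same content, new path
print("new blocks to upload:", after_rename - already_stored)  # -> empty set
```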

Unfortunately, I don’t know what would be causing these hash mismatches - hopefully @kenkendk or somebody better versed in the hashing side can offer some insight.

Ah, interesting. While I didn’t change anything on the source machine, I did change the mount point of the source drive on the server running Duplicati. It did take a very long time to do that initial backup, but yes, all my *.aes files have their older modification dates going back months. Now that that first run is done, the backups are lightning fast (Kaby Lake Core i5). This must have happened for all the jobs, but I only noticed it on this one because the others are much smaller backup jobs. I’ll go back to the logs and check.

Thank you for the insight into what happened there, I really appreciate that.


I’m getting this error on another job now:

Warnings: [
remote file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes is listed as Verified with size 37421056 but should be 47521821, please verify the sha256 hash “o83q16QqxPVeE7K/PIJpOH0tpn+/JczT3fvAn+K4Mds=”
]

Is this serious?

This was on a verify job. I get 2 similar errors during a backup.

Edit: the file as reported from the machine running Duplicati is 37,421,056 bytes (matches the verified size above):

-rwxr-xr-x 1 root root 37421056 Dec 19 10:17 duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes

So I don’t know where it is getting 47,521,821 bytes from (the size it ‘should be’?). I’ve just re-run the backup and I get the same error.

I believe this is the size stored in the local database. When Duplicati creates an archive file, it stores the file size in the database and then uses it as a quick check that the uploaded file size didn’t change (such as due to an internet transmission issue).

So in this case the size of the destination file seems to have changed since it was created. If it’s a recent file, it’s possible something happened during the initial upload. If it’s an older file then something may have happened at the destination.

In either case, I think what needs to happen is for the archive file to be downloaded and verified. If it’s a valid archive, I expect Duplicati would update the local database size.

Unfortunately, I don’t know how to make that happen - but @kenkendk, if not others, certainly should.
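
If you wanted to see how many volumes are affected, something along these lines might work as a rough check. It assumes the job’s local database has a Remotevolume table with Name and Size columns (table and column names may differ between Duplicati versions) and that the destination is a locally mounted folder; both paths are placeholders:

```python
# Rough sketch, assuming the job database has a "Remotevolume" table with
# "Name" and "Size" columns (names may differ by Duplicati version) and that
# the destination is a locally mounted folder. Lists volumes whose recorded
# size doesn't match the file actually present at the destination.
import os
import sqlite3

DB_PATH = "/path/to/job-database.sqlite"   # placeholder
DEST_DIR = "/mnt/backup-destination"       # placeholder

con = sqlite3.connect(DB_PATH)
for name, size in con.execute("SELECT Name, Size FROM Remotevolume"):
    path = os.path.join(DEST_DIR, name)
    if os.path.isfile(path) and os.path.getsize(path) != size:
        print(f"{name}: database says {size}, destination file is {os.path.getsize(path)}")
con.close()
```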

Interesting. I tried to delete and recreate the database several times and I still get the error. Then I found the “repair” option under Command Line, which gives me:

Listing remote folder ...
remote file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes is listed as Verified with size 37421056 but should be 47521821, please verify the sha256 hash "o83q16QqxPVeE7K/PIJpOH0tpn+/JczT3fvAn+K4Mds="
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 1 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 2 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 3 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 4 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (unknown) ...
Operation Get with file duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes attempt 5 of 5 failed with message: File length is invalid => File length is invalid
Failed to perform verification for file: duplicati-b2d9a079f8fcf4e32801fc4fdfe50aef6.dblock.zip.aes, please run verify; message: File length is invalid => File length is invalid

At this point I’ve given up on the old backup set. I’ve moved it to an old/ directory and I’m just starting a new backup. Probably should have done that all along, but for anyone not using local storage it would be a pain to have to re-upload many GB of backup sets (I have other offsite backups).

Replying to myself, after moving the old backup files, deleting the database, and re-running the backup job, I’m back to hash mismatch errors. I’m starting to wonder if this is an issue with the canary build; I’ll try going back to the beta from before and see if anything changes.

  Failed to process file duplicati-b4e99860187e14225b262ca87c065dc1c.dblock.zip.aes => Hash mismatch on file "/tmp/dup-0595915d-c50c-4ce8-9594-43629189571a", recorded hash: oxOTds52+6CjHn0mc7UZ58KbzfhS5lZ1xW9gmewaHuw=, actual hash 3qvqe+jOVP0ktD83p/Mor8UbfasehAKOZa5iA1OVYSQ=
]

I don’t recall hearing about it from others, but then it may be you’re using a destination or setting others aren’t.

If you’ve got the time, then the beta rollback sounds like a great way to help isolate the issue.

Hi,

I recently moved to Duplicati from CrashPlan, but after the first SFTP backup I got the same issue. I made a backup of ~300 GB of data to an SFTP server, but during the backup my computer rebooted due to a Windows update. I don’t know if that has something to do with this, but I think I had to rebuild the database afterwards. However, I have verified the size of the file using WinSCP to connect to the server, and also directly in Windows on the host machine, and it is 44493660 as stated in the warning messages. The SHA-256 is 2d8582625abd5ad4a1d81c1bb772f39d243434080024e1de683b55df13aff5ac according to the SFTP server and WinSCP, which is not the same as in the warning messages.
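
Side note: the hash in the warning looks base64-encoded while WinSCP reports hex, so the two strings can’t be compared by eye; converting the hex digest puts it in the same form as the warning, for example:

```python
# Sketch: convert a hex SHA-256 (as shown by WinSCP/sha256sum) to the
# base64 form that Duplicati's warning appears to use, for a direct comparison.
import base64

hex_digest = "2d8582625abd5ad4a1d81c1bb772f39d243434080024e1de683b55df13aff5ac"
b64_digest = base64.b64encode(bytes.fromhex(hex_digest)).decode()
print(b64_digest)  # compare this against the hash quoted in the warning
```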

The log:

DeletedFolders: 0
ModifiedFiles: 2
ExaminedFiles: 42400
OpenedFiles: 2
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 374410675997
SizeOfOpenedFiles: 11098
NotProcessedFiles: 0
AddedFolders: 0
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults:
    DeletedFileCount: 278
    DownloadedFileCount: 0
    UploadedFileCount: 0
    DeletedFileSize: 7285059022
    DownloadedFileSize: 0
    UploadedFileSize: 0
    Dryrun: False
    MainOperation: Compact
    ParsedResult: Success
    EndTime: 2018-01-03 07:04:33
    BeginTime: 2018-01-03 07:04:07
    Duration: 00:00:25.4041760
    BackendStatistics:
        RemoteCalls: 286
        BytesUploaded: 4842823
        BytesDownloaded: 57254007
        FilesUploaded: 3
        FilesDownloaded: 3
        FilesDeleted: 278
        FoldersCreated: 0
        RetryAttempts: 0
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 13225
        KnownFileSize: 346541677172
        LastBackupDate: 2018-01-03 08:00:52
        BackupListCount: 3
        TotalQuotaSpace: 0
        FreeQuotaSpace: 0
        AssignedQuotaSpace: -1
        ParsedResult: Success
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    ParsedResult: Success
    EndTime: 2018-01-03 07:04:33
    BeginTime: 2018-01-03 07:03:52
    Duration: 00:00:40.8690253
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20180103T070052Z.dlist.zip.aes
        Value: [],
        Key: duplicati-ibe3266c190ef4ce8825c9889024f89e8.dindex.zip.aes
        Value: [],
        Key: duplicati-bdd997f12e438448cb7242f4df0c42055.dblock.zip.aes
        Value: []
    ]
    ParsedResult: Success
    EndTime: 2018-01-03 07:05:58
    BeginTime: 2018-01-03 07:04:40
    Duration: 00:01:18.0469487
ParsedResult: Warning
EndTime: 2018-01-03 07:05:58
BeginTime: 2018-01-03 07:00:52
Duration: 00:05:05.7192388
Messages: [
    No remote filesets were deleted,
    Compacting because there are 139 fully deletable volume(s),
    Deleted 278 files, which reduced storage by 6,78 GB,
    removing file listed as Temporary: duplicati-b80e72fb2a3834a4da17616097735c630.dblock.zip.aes,
    removing file listed as Temporary: duplicati-ief8fb87b586e40fe9d9f47414c5d8003.dindex.zip.aes
]
Warnings: [
    remote file duplicati-b1e57936bafbd4f25a0678cf6c5d0303b.dblock.zip.aes is listed as Verified with size 44493660 but should be 52427501, please verify the sha256 hash "6X37MDa+Mw2uGm/P569ExKKiN3dIIW69LDbndprv9hY=",
    remote file duplicati-b1e57936bafbd4f25a0678cf6c5d0303b.dblock.zip.aes is listed as Verified with size 44493660 but should be 52427501, please verify the sha256 hash "6X37MDa+Mw2uGm/P569ExKKiN3dIIW69LDbndprv9hY="
]
Errors: []

How should I proceed to solve this? I have tried to restore a file, and that worked, but I got the same warning.

Most likely the remote file is broken, since it is missing some data.

I suggest you remove the file and then run the purge-broken-files command from the “Commandline” area.

I realize this would be a decent chunk of work, but would it make sense to have the GUI provide a way to let Duplicati remove the remote file rather than needing a user to do it (and risk deleting the wrong file)?

An added benefit could be a notice of what files/versions would be affected by the loss of that remote file…

Thanks, that solved my problem!

Yes, it simply requires that I (or someone else) provide the UI for “purge-broken-files”, which I think we should do anyway.

Glad to hear you got things working again!

Let me know if I got it wrong, but I went ahead and flagged the post to which I think you were referring when you said your problem was solved. :slight_smile:

Does Duplicati not support backing up to a NAS mounted via CIFS? I went back to the beta version, removed all the old backup files from the destination, recreated the database, and started backing up from scratch, and I’m still getting hundreds of these errors on one job. It appears every file is failing with the “please verify the sha256 hash” error now.

I’m thinking Duplicati doesn’t like something in my new configuration, but I don’t understand what to do. If the remote file is a different size than expected, can Duplicati reconcile the difference and just say “OK, the file is now X size, it’s fine”?

Edit - I just looked, and 3 of my backup jobs are having this problem. The only one that is not is a small DAVFS job. The destination is the same, but the source is DAVFS instead of CIFS. However, the error messages seem to imply that the problem is on the storage/destination side, not the source side, so I don’t know whether it means anything that one job is working (maybe just because it’s much smaller than the others).

Edit 2 - scratch that: another job that is failing is smaller than the job that is working, so it doesn’t appear to be a backup size issue.

By the way, JonMikeIV, I really appreciate your input on all my recent threads as I try to get Duplicati working, but this issue is not solved - you were replying to Kenny in this thread, and he apparently solved his issue, but I still haven’t managed to get rid of this error.

To do what Kenny did, does that mean I have to delete the entire remote backup archive (the aes files) and run purge-broken-files? Does that mean it will wipe out all my backups for this job and start over?

Deleting the aes files would indeed be starting from scratch. You’d be better off deleting the job - or at least creating (export/import?) a new one to a different folder on your destination.

I’m a little confused - are the sources DAVFS and CIFS (so they’re not local to the machine running Duplicati) or the destinations (or both)?

It is supposed to work. The error messages seem to indicate that the files are somehow truncated after being stored.

Duplicati basically does a file copy from the temporary folder to the destination, so I would assume that you could replicate a similar problem if you copied a file to the CIFS destination manually?
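
Something like this rough sketch (placeholder paths) would do that manual test - copy a file to the CIFS mount yourself and check whether the size and SHA-256 survive the copy:

```python
# Sketch of the manual test suggested above, with placeholder paths:
# copy a file to the CIFS destination and verify that size and SHA-256 survive.
import base64
import hashlib
import os
import shutil

SRC = "/tmp/testfile.bin"                 # placeholder source file
DEST = "/mnt/cifs-backup/testfile.bin"    # placeholder CIFS destination

def sha256_b64(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode()

shutil.copyfile(SRC, DEST)
print("size matches:", os.path.getsize(SRC) == os.path.getsize(DEST))
print("hash matches:", sha256_b64(SRC) == sha256_b64(DEST))
```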