Errors and warnings after "successful" backup job?

I have a couple of warnings and a couple of errors that pop up on two of my backups. They appear even though those backups claim to have finished “successfully”. I would like to make sure that what I’m seeing is not serious, or at least doesn’t affect the integrity of the backups.

First, the WARNING. I’m pretty sure it’s innocuous because it seems to be about storing empty metadata rather than about the file itself? In any case, the following is an example of the type of warning I’m seeing:

Nov 27, 2019 12:02 AM: Result 
DeletedFiles: 1 
DeletedFolders: 0 
ModifiedFiles: 1 
ExaminedFiles: 36027 
OpenedFiles: 12 
AddedFiles: 11 
SizeOfModifiedFiles: 103710 
SizeOfAddedFiles: 982714 
SizeOfExaminedFiles: 20194206331 
SizeOfOpenedFiles: 1086424 
NotProcessedFiles: 0 
AddedFolders: 0 
TooLargeFiles: 0 
FilesWithError: 0 
ModifiedFolders: 0 
ModifiedSymlinks: 0 
AddedSymlinks: 0 
DeletedSymlinks: 0 
PartialBackup: False 
Dryrun: False 
MainOperation: Backup 
CompactResults: 
    DeletedFileCount: 2 
    DownloadedFileCount: 0 
    UploadedFileCount: 0 
    DeletedFileSize: 1914 
    DownloadedFileSize: 0 
    UploadedFileSize: 0 
    Dryrun: False 
    MainOperation: Compact 
    ParsedResult: Warning 
    Version: 2.0.4.5 (2.0.4.5_beta_2018-11-28) 
    EndTime: 11/27/2019 12:01:21 AM (1574830881) 
    BeginTime: 11/27/2019 12:01:18 AM (1574830878) 
    Duration: 00:00:02.9266682 
    Messages: [ 
        2019-11-27 00:00:00 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started, 
        2019-11-27 00:00:09 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (), 
        2019-11-27 00:00:27 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (717 bytes), 
        2019-11-27 00:01:03 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b4b659c409149408ab68b093bfc360440.dblock.zip.aes (667.76 KB), 
        2019-11-27 00:01:07 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b4b659c409149408ab68b093bfc360440.dblock.zip.aes (667.76 KB), 
... 
    ] 
    Warnings: [ 
        2019-11-27 00:00:55 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.MetadataGenerator.Metadata-MetadataProcessFailed]: Failed to process metadata for "<<...>>\Documents\BookSmartData\kris birthday book\library\1ea72ce3-548a-4a82-91ac-2cc596a19988.original", storing empty metadata, 
        2019-11-27 00:00:55 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: <<...>>\Documents\BookSmartData\kris birthday book\library\1ea72ce3-548a-4a82-91ac-2cc596a19988.original 
    ] 

And the following is the type of error I’m seeing:

DeletedFiles: 0
DeletedFolders: 0
ModifiedFiles: 0
ExaminedFiles: 61044
OpenedFiles: 0
AddedFiles: 0
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 0
SizeOfExaminedFiles: 191264017084
SizeOfOpenedFiles: 0
NotProcessedFiles: 0
AddedFolders: 0
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults: null
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    CompactResults: null
    ParsedResult: Error
    Version: 2.0.4.5 (2.0.4.5_beta_2018-11-28)
    EndTime: 12/1/2019 2:03:26 AM (1575183806)
    BeginTime: 12/1/2019 2:03:09 AM (1575183789)
    Duration: 00:00:16.6621065
    Messages: [
        2019-12-01 02:00:00 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
        2019-12-01 02:00:10 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
        2019-12-01 02:02:16 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (13.27 KB),
        2019-12-01 02:03:09 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed,
        2019-12-01 02:03:09 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00,
...
    ]
    Warnings: []
    Errors: [
        2019-12-01 02:06:33 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190616T070000Z.dlist.zip.aes,
        2019-12-01 02:07:51 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190804T070002Z.dlist.zip.aes,
        2019-12-01 02:09:10 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190908T070000Z.dlist.zip.aes,
        2019-12-01 02:10:31 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191103T070003Z.dlist.zip.aes,
        2019-12-01 02:11:50 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191124T070000Z.dlist.zip.aes,
...
    ]
    BackendStatistics:
        RemoteCalls: 72
        BytesUploaded: 0
        BytesDownloaded: 790723838
        FilesUploaded: 0
        FilesDownloaded: 70
        FilesDeleted: 0
        FoldersCreated: 0
        RetryAttempts: 24
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 13584
        KnownFileSize: 177480612080
        LastBackupDate: 11/24/2019 2:00:00 AM (1574578800)
        BackupListCount: 6
        TotalQuotaSpace: 1104880336896
        FreeQuotaSpace: 833139431124
        AssignedQuotaSpace: -1
        ReportedQuotaError: False
        ReportedQuotaWarning: False
        ParsedResult: Error
        Version: 2.0.4.5 (2.0.4.5_beta_2018-11-28)
        Messages: [
            2019-12-01 02:00:00 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
            2019-12-01 02:00:10 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
            2019-12-01 02:02:16 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (13.27 KB),
            2019-12-01 02:03:09 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed,
            2019-12-01 02:03:09 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00,
...
        ]
        Warnings: []
        Errors: [
            2019-12-01 02:06:33 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190616T070000Z.dlist.zip.aes,
            2019-12-01 02:07:51 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190804T070002Z.dlist.zip.aes,
            2019-12-01 02:09:10 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190908T070000Z.dlist.zip.aes,
            2019-12-01 02:10:31 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191103T070003Z.dlist.zip.aes,
            2019-12-01 02:11:50 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191124T070000Z.dlist.zip.aes,
...
        ]
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20190616T070000Z.dlist.zip.aes
        Value: [
            Key: Error
            Value: Hash mismatch on file "%TEMP%\dup-7d8fcbf4-3aad-4a98-9a43-85b1fc010b0f", recorded hash: UoGtUcOHV9AXWupkOzUW1p+ghPDVwPcq/3ylOu2uBO0=, actual hash L7dD/9ZL9HfT+SXIB22ixOIih1WpEh95+Xom012Y0F4=
        ],
        Key: duplicati-20190804T070002Z.dlist.zip.aes
        Value: [
            Key: Error
            Value: Hash mismatch on file "%TEMP%\dup-f7cb2c4e-90b9-41bf-8eaf-e336f85eddba", recorded hash: UUo6mknmU5uMn9KY3hRT6FkvYkGwPmuKyMJpY7fPEMs=, actual hash xGfSoVH/OBMJcgdBxeuKr1Udy2BzPjoMFXQLi8FqHSg=
        ],
        Key: duplicati-20190908T070000Z.dlist.zip.aes
        Value: [
            Key: Error
            Value: Hash mismatch on file "%TEMP%\dup-549d286e-1591-42c5-98a2-bd402ea2775b", recorded hash: Tblpj6iv2w4ddt+dUP56mkxB66KN4fm2bd7siFlgD1E=, actual hash ikkUhmd/3akqqGDc5uJXCqweoFwEeIW4dQqSaMeo9Qw=
        ],
        Key: duplicati-20191103T070003Z.dlist.zip.aes
        Value: [
            Key: Error
            Value: Hash mismatch on file "%TEMP%\dup-2da56840-61de-4dda-854a-fc216f78803d", recorded hash: OCDN3y9ZDpAwpXkqUdU2DTpZKEY3ScmQsmN7GrX/wsA=, actual hash S3atQ8MOQgRChlU5rl288KDbyYbPIEI93D9OgjeBlNI=
        ],
        Key: duplicati-20191124T070000Z.dlist.zip.aes
        Value: [
            Key: Error
            Value: Hash mismatch on file "%TEMP%\dup-a0da7821-8703-4b24-a803-982674c0df39", recorded hash: f8BPkSosFGEvr7fF+awqkmvTWx4UoVJqv9x7rdhTr8c=, actual hash y/qye18BuEX+KE0+cp6nzGR0L7JulbAT+2Xs+nvIu3c=
        ],
...
    ]
    ParsedResult: Error
    Version: 2.0.4.5 (2.0.4.5_beta_2018-11-28)
    EndTime: 12/1/2019 2:25:49 AM (1575185149)
    BeginTime: 12/1/2019 2:05:15 AM (1575183915)
    Duration: 00:20:34.5288014
    Messages: [
        2019-12-01 02:00:00 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
        2019-12-01 02:00:10 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
        2019-12-01 02:02:16 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (13.27 KB),
        2019-12-01 02:03:09 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed,
        2019-12-01 02:03:09 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00,
...
    ]
    Warnings: []
    Errors: [
        2019-12-01 02:06:33 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190616T070000Z.dlist.zip.aes,
        2019-12-01 02:07:51 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190804T070002Z.dlist.zip.aes,
        2019-12-01 02:09:10 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190908T070000Z.dlist.zip.aes,
        2019-12-01 02:10:31 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191103T070003Z.dlist.zip.aes,
        2019-12-01 02:11:50 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191124T070000Z.dlist.zip.aes,
...
    ]
ParsedResult: Error
Version: 2.0.4.5 (2.0.4.5_beta_2018-11-28)
EndTime: 12/1/2019 2:25:51 AM (1575185151)
BeginTime: 12/1/2019 2:00:00 AM (1575183600)
Duration: 00:25:51.1571890
Messages: [
    2019-12-01 02:00:00 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
    2019-12-01 02:00:10 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
    2019-12-01 02:02:16 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (13.27 KB),
    2019-12-01 02:03:09 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed,
    2019-12-01 02:03:09 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00,
...
]
Warnings: []
Errors: [
    2019-12-01 02:06:33 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190616T070000Z.dlist.zip.aes,
    2019-12-01 02:07:51 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190804T070002Z.dlist.zip.aes,
    2019-12-01 02:09:10 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190908T070000Z.dlist.zip.aes,
    2019-12-01 02:10:31 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191103T070003Z.dlist.zip.aes,
    2019-12-01 02:11:50 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191124T070000Z.dlist.zip.aes,
...
]

It seems related to Error Message following successful backup job, or Database error backup or Errors on backup but they don’t seem to shed any light.

This is a backup to OneDrive. Should I be worried?

Is there a way to test whether these errors and warnings are significant? I assume one way would be to recover the files to a separate folder / partition and then run a deep compare on the files. But… the one with errors is 180 GB large, so that would take a while to download from a cloud service provider. Is there any other way of testing the fidelity of the backup after the errors?

Thanks!

In your first log, it seems one file could not be read properly. You won’t be able to restore that file from that backup job, but all other files should be restorable. The reason it wasn’t backed up could be one of a few things: a permission issue, the file is locked, the file was deleted and no longer exists, etc. Do you see that file present if you look for it in Windows Explorer?

In your second log, it looks like Duplicati failed to process back-end files when running its verification test on them. Does that happen every time you do a backup? Have you ever tried doing a test restore?

Thanks @drwtsn32 – I’m happy to ignore the warning, then. I’ll double-check the file, but even if it isn’t being backed up, it’s not a file I particularly care about.

The errors do seem to occur on every single backup I’ve done, though I’m not certain since I don’t usually check. I remember seeing something similar at least occasionally for some time now. Is there a way to check historical logs?

I have not tried doing a test restore, since that backup is 180 GB or so… I’d prefer to avoid pulling that over the network for obvious reasons. Would a partial restore be effective? Or would I have to be lucky and pick files that happened to be stored in the affected blocks?

Should I also try some of the database repair things listed here: Using the Graphical User Interface - Duplicati 2 User's Manual?

In other words, here is a list of things to try in no particular order:

Partial restore
Full restore
Database repairs?

Did I miss anything?

Hello,

Just to be clear, I don’t recommend ignoring all Warning messages. They are often important indications of a problem. It’s just that sometimes they aren’t a huge deal. In most cases you should be able to fix the root problem so that they rarely show up.

The errors do seem to occur on every single back up I’ve done

That’s not good - we should figure out the issue and resolve it.

Is there a way to check historical logs?

Yes, in the main Duplicati UI, click on the backup set name, then click the “Show log” link. You should see a screen like this where you can view the recent backup logs:

I have not tried doing a test restore

You should try it, and I’m only talking about restoring one or two files as a test. It makes me nervous that Duplicati is failing during the back-end verification at the end of the job; that’s a safety check to make sure the backed-up data is readable. Let us know what happens after you do a small test restore.

Should I also try some of the database repair

It should not be necessary to run a repair unless Duplicati actually prompts you to do so. A database repair is for fixing other types of issues, not Warnings or Errors that happen during a backup job.

Thanks

Sorry, I didn’t mean to imply I would ignore them, just that they weren’t top priority. However, I just checked, and that file had inexplicably stopped inheriting permissions from its parent folder and had read permissions turned off. No wonder it was throwing a warning.

I looked at the logs. Even though I’ve been backing up for quite some time (a year or more?), I only have 5 past logs visible. They’re weekly backups. The errors appear in every log I have. Interestingly, it seems to be having trouble with the exact same files every time?

That is, from the December 8 backup:

Errors: [
        2019-12-08 02:07:11 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190616T070000Z.dlist.zip.aes,
        2019-12-08 02:08:31 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190804T070002Z.dlist.zip.aes,
        2019-12-08 02:09:50 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190908T070000Z.dlist.zip.aes,
        2019-12-08 02:11:10 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191103T070003Z.dlist.zip.aes,
        2019-12-08 02:12:32 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191124T070000Z.dlist.zip.aes,
...
    ]

From the December 1 backup:

Errors: [
        2019-12-01 02:06:33 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190616T070000Z.dlist.zip.aes,
        2019-12-01 02:07:51 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190804T070002Z.dlist.zip.aes,
        2019-12-01 02:09:10 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20190908T070000Z.dlist.zip.aes,
        2019-12-01 02:10:31 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191103T070003Z.dlist.zip.aes,
        2019-12-01 02:11:50 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20191124T070000Z.dlist.zip.aes,
...
    ]

The last entry changes slightly the further back you go, but it started appearing on the 11/24 backup.

So one interesting thing is that when I went to restore, I noticed this correlation:
[screenshot of the restore version list]
Note that each of these entries corresponds to a timestamp on the error.

In any case, I picked two files to test the restore process on from the Nov 3, 2019 backup set. It successfully restored those files. In case it is helpful, here is the log from that:

FilesRestored: 2
SizeOfRestoredFiles: 6868756
FoldersRestored: 0
SymlinksRestored: 0
FilesPatched: 0
FilesDeleted: 0
FoldersDeleted: 0
SymlinksDeleted: 0
MainOperation: Restore
RecreateDatabaseResults: null
ParsedResult: Success
Version: 2.0.4.23 (2.0.4.23_beta_2019-07-14)
EndTime: 12/11/2019 6:22:08 AM (1576063328)
BeginTime: 12/11/2019 6:18:23 AM (1576063103)
Duration: 00:03:44.3181762
Messages: [
    2019-12-11 06:18:23 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Restore has started,
    2019-12-11 06:18:26 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
    2019-12-11 06:20:52 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (13.27 KB),
    2019-12-11 06:20:53 -05 - [Information-Duplicati.Library.Main.Database.LocalRestoreDatabase-SearchingBackup]: Searching backup 2 (11/3/2019 7:00:03 AM) ...,
    2019-12-11 06:21:02 -05 - [Information-Duplicati.Library.Main.Operation.RestoreHandler-RemoteFileCount]: 2 remote files are required to restore,
...
]
Warnings: []
Errors: []
BackendStatistics:
    RemoteCalls: 3
    BytesUploaded: 0
    BytesDownloaded: 52405338
    FilesUploaded: 0
    FilesDownloaded: 2
    FilesDeleted: 0
    FoldersCreated: 0
    RetryAttempts: 0
    UnknownFileSize: 0
    UnknownFileCount: 0
    KnownFileCount: 13587
    KnownFileSize: 177486898775
    LastBackupDate: 12/8/2019 2:00:03 AM (1575788403)
    BackupListCount: 7
    TotalQuotaSpace: 1104880336896
    FreeQuotaSpace: 833130616643
    AssignedQuotaSpace: -1
    ReportedQuotaError: False
    ReportedQuotaWarning: False
    ParsedResult: Success
    Version: 2.0.4.23 (2.0.4.23_beta_2019-07-14)
    Messages: [
        2019-12-11 06:18:23 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Restore has started,
        2019-12-11 06:18:26 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  (),
        2019-12-11 06:20:52 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (13.27 KB),
        2019-12-11 06:20:53 -05 - [Information-Duplicati.Library.Main.Database.LocalRestoreDatabase-SearchingBackup]: Searching backup 2 (11/3/2019 7:00:03 AM) ...,
        2019-12-11 06:21:02 -05 - [Information-Duplicati.Library.Main.Operation.RestoreHandler-RemoteFileCount]: 2 remote files are required to restore,
...
    ]
    Warnings: []
    Errors: []

I actually did two restores, from two different backup sets (including the Sept 8 set). Both seem to have worked fine (I even restored different files each time). In one case the files came from the same folder, so they were restored as-is to the desktop. In the other I chose files from different folder paths, and so I got warnings from Duplicati that it had created the folder structure for the files to be restored into.

Hmm, so what now? And thank you for helping out!

This sounds normal. I believe Duplicati only retains a month’s worth of backup logs by default.

This isn’t surprising if Duplicati is always complaining about the dlist files. There is one dlist file per backup version, and the timestamp in the filename corresponds to the timestamp of the backup.

I do not know why you are getting a consistent error in the TestHandler. I’d recommend starting a backup and then watching the live logs under About → Show log → Live → Verbose. Maybe you’ll be able to glean additional detail there.

By the way, what back end are you using?

For what it’s worth, the other backup with the warnings just ran and had no warnings this time, so the file permission issue was the culprit. Thanks!

OneDrive Personal

So I’ve done that, and I don’t see anything obvious, but maybe you will. I can send the whole log, but here is a snippet of it trying two different files. There are a bunch of these types of errors (I changed the paths to slightly obfuscate my drive structure):

    {"ClassName":"Duplicati.Library.Main.BackendManager+HashMismatchException","Message":"Hash mismatch on file \"%TEMP%\\dup-76a5acaa-d53f-4f88-b75d-d894c34aebd1\", recorded hash: Tblpj6iv2w4ddt+dUP56mkxB66KN4fm2bd7siFlgD1E=, actual hash ikkUhmd/3akqqGDc5uJXCqweoFwEeIW4dQqSaMeo9Qw=","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":"   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)\r\n   at Duplicati.Library.Main.BackendManager.ThreadRun()","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":"8\nDoGet\nDuplicati.Library.Main, Version=2.0.4.23, Culture=neutral, PublicKeyToken=null\nDuplicati.Library.Main.BackendManager\nVoid DoGet(FileEntryItem)","HResult":-2146233088,"Source":"Duplicati.Library.Main","WatsonBuckets":null}
    Dec 11, 2019 8:16 PM: Backend event: Get - Completed: duplicati-20190908T070000Z.dlist.zip.aes (5.96 MB)
    Dec 11, 2019 8:16 PM: Backend event: Get - Started: duplicati-20190908T070000Z.dlist.zip.aes (5.96 MB)
    Dec 11, 2019 8:16 PM: Failed to process file duplicati-20190804T070002Z.dlist.zip.aes
    {"ClassName":"Duplicati.Library.Main.BackendManager+HashMismatchException","Message":"Hash mismatch on file \"%TEMP%\\dup-783935bc-7e4f-4eb8-ab73-54498d6f94be\", recorded hash: UUo6mknmU5uMn9KY3hRT6FkvYkGwPmuKyMJpY7fPEMs=, actual hash xGfSoVH/OBMJcgdBxeuKr1Udy2BzPjoMFXQLi8FqHSg=","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":"   at Duplicati.Library.Main.BackendManager.GetForTesting(String remotename, Int64 size, String hash)\r\n   at Duplicati.Library.Main.Operation.TestHandler.DoRun(Int64 samples, LocalTestDatabase db, BackendManager backend)","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":"8\nGetForTesting\nDuplicati.Library.Main, Version=2.0.4.23, Culture=neutral, PublicKeyToken=null\nDuplicati.Library.Main.BackendManager\nVoid GetForTesting(System.String, Int64, System.String)","HResult":-2146233088,"Source":"Duplicati.Library.Main","WatsonBuckets":null}
    Dec 11, 2019 8:16 PM: Backend event: Get - Failed: duplicati-20190804T070002Z.dlist.zip.aes (5.96 MB)
    Dec 11, 2019 8:16 PM: Operation Get with file duplicati-20190804T070002Z.dlist.zip.aes attempt 5 of 5 failed with message: Hash mismatch on file "%TEMP%\dup-783935bc-7e4f-4eb8-ab73-54498d6f94be", recorded hash: UUo6mknmU5uMn9KY3hRT6FkvYkGwPmuKyMJpY7fPEMs=, actual hash xGfSoVH/OBMJcgdBxeuKr1Udy2BzPjoMFXQLi8FqHSg=
    {"ClassName":"Duplicati.Library.Main.BackendManager+HashMismatchException","Message":"Hash mismatch on file \"%TEMP%\\dup-783935bc-7e4f-4eb8-ab73-54498d6f94be\", recorded hash: UUo6mknmU5uMn9KY3hRT6FkvYkGwPmuKyMJpY7fPEMs=, actual hash xGfSoVH/OBMJcgdBxeuKr1Udy2BzPjoMFXQLi8FqHSg=","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":"   at Duplicati.Library.Main.BackendManager.GetForTesting(String remotename, Int64 size, String hash)\r\n   at Duplicati.Library.Main.Operation.TestHandler.DoRun(Int64 samples, LocalTestDatabase db, BackendManager backend)","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":"8\nGetForTesting\nDuplicati.Library.Main, Version=2.0.4.23, Culture=neutral, PublicKeyToken=null\nDuplicati.Library.Main.BackendManager\nVoid GetForTesting(System.String, Int64, System.String)","HResult":-2146233088,"Source":"Duplicati.Library.Main","WatsonBuckets":null}
    Dec 11, 2019 8:16 PM: Backend event: Get - Completed: duplicati-20190804T070002Z.dlist.zip.aes (5.96 MB)
    Dec 11, 2019 8:16 PM: Backend event: Get - Started: duplicati-20190804T070002Z.dlist.zip.aes (5.96 MB)
    Dec 11, 2019 8:16 PM: Backend event: Get - Retrying: duplicati-20190804T070002Z.dlist.zip.aes (5.96 MB)

There’s a bunch of these type information logs:

Dec 11, 2019 8:11 PM: Including path as no filters matched: %SOMEFILE%
Dec 11, 2019 8:11 PM: Skipped checking file, because no metadata was updated %SOMEFILE%

But I suspect that’s just normal operation…

Do you need more of the log?

Any suggestions on what to try next?

--log-retention = 30D
Set the time after which log data will be purged from the database.

The default –log-retention might explain that; however, what’s gone is gone unless you saved the information somehow, for example by using reporting options, your own –log-file, or DB backups.

On that note, please do not attempt to Recreate or Repair the DB until we’ve found out whether you have damaged dlist files. They can be rebuilt from the DB if needed, but only if the DB still has the old info.

Before starting there, do you ever throttle the upload or download speed by advanced options or through use of the throttle controls at the top of the web UI screen (to the right of the status bar)?

You can get a second opinion on your dlist files in a couple of ways. One simple one not involving Duplicati at all is to download from OneDrive then see if AES Crypt can decrypt them successfully.

Same thing can be done with Duplicati’s included SharpAESCrypt.exe command if you prefer CLI.

A decrypt failure probably means damage (somehow). If it decrypts, feel free to peek at filelist.json, which is the map to your backup; that map is why I don’t want you to lose these files, or the backup is lost.

If you’d rather have Duplicati web UI do the work, you can open another tab to do direct restore to determine whether it can read your dlist files. Copy parameters from backup tab, or other records. Getting as far as being able to see the contents of the questionable backups means dlist read OK.

I’m not sure what shape your other files are in. The automatic verification sample after a backup includes dindex and dblock files as well; however, since all I see so far are dlist errors, let’s start with those. Heavier sampling for the file integrity test is possible, all the way up to all of your 180 GB, but let’s hold off.
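For reference, the recorded/actual hashes Duplicati prints in these errors look like base64-encoded SHA-256 digests (that’s an assumption on my part, based on their length). If so, you can get a second opinion on any file you download with a small sketch like this:

```python
import base64
import hashlib

def file_hash_b64(path):
    """Base64-encoded SHA-256 digest of a file, computed in chunks
    so large dlist/dblock files don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return base64.b64encode(digest.digest()).decode("ascii")

# e.g. file_hash_b64("duplicati-20190908T070000Z.dlist.zip.aes")
```

If the result matches the “recorded hash” from the log, your downloaded copy is fine; if it matches the “actual hash”, you downloaded the same damaged bytes Duplicati saw.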

Yes, I throttle uploads to 1 MB/s for Duplicati using the interface like you mentioned. To be honest, that’s probably above what my ISP provides anyway, but yes, it is throttled.

Ok, so if I’m following you, I should

  1. Download one of the files that’s throwing an error
  2. Launch a CMD/PS instance and run SharpAESCrypt.exe on it to decrypt the dlist file. The password for that is presumably the one I set for this particular backup? This should produce a JSON-formatted file.
  3. Look through the JSON-formatted file and see if there is something wrong there?

OR (if I want to just use the GUI), I can just use the direct restore functionality?

I’ll try to give one or the other a shot over the weekend. Thanks for the pointers of additional tests to run.

Happy New Year!

Password should be the usual one. If it’s bad, it will probably blow up there. If it survives, unzip it. Examining the filelist.json file inside the zip is probably not necessary and would be some work…
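If the decryption does succeed, the output should be an ordinary zip archive. Short of reading the JSON, a minimal check (assuming that layout) that the archive opens cleanly and contains filelist.json might look like:

```python
import zipfile

def check_dlist_zip(path):
    """Verify a decrypted dlist archive: the zip must be readable
    (testzip() returns the first corrupt member name, or None) and
    it should contain the filelist.json entry that maps the backup."""
    with zipfile.ZipFile(path) as z:
        return z.testzip() is None and "filelist.json" in z.namelist()
```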

Ok, I used SharpAESCrypt.exe and it complains of “Error: Invalid header marker” with an exit code of 1. It does so on all three of the dlist files I tried. That seems like a problem…

So…, now what? Thanks again for your help!

Upload throttle corrupts backup, especially OneDrive. Analyzed, with code proposed. #3787
probably is what happened. It would be more certain if you looked over the file with a hex editor, but regardless, you’ve got some bad files. If you like, you can run the same test on a non-erroring dlist file to compare.
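On the hex-editor idea: the AES Crypt container begins with the ASCII bytes “AES” followed by a version byte, and that leading marker is what the “Invalid header marker” message refers to. A quick sketch to check it without a hex editor (the function name is mine):

```python
def has_aescrypt_header(path):
    """True if the file starts with the AES Crypt magic bytes b'AES'.
    SharpAESCrypt's 'Invalid header marker' error means this fails."""
    with open(path, "rb") as f:
        return f.read(3) == b"AES"

# e.g. has_aescrypt_header("duplicati-20190616T070000Z.dlist.zip.aes")
```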

You did the download some other way, e.g. from the OneDrive web UI, I hope? Otherwise, there’s also a bug with throttled downloads. That one is less severe, because removing the throttling clears it right up; the upload throttling bug results in a damaged file in the backup. The bug was fixed in 2.0.4.22 Canary, but isn’t in the Beta.

Assuming you have no throttling now or did the download outside of Duplicati, then the test was fine but you need to move the damaged files aside to a different folder (don’t destroy them just yet, just in case), go to the Database screen, and run Repair (assuming that you haven’t done database Recreate before).

If the backup is super-valuable we can take some more precautions in case something goes wrong with Repair. If it’s not particularly valuable, then another option is to recreate the backup from scratch without throttling, or pick up a release with the throttling fix and keep throttling. I’m worried about other files that might be damaged that haven’t been found yet. A full scan is possible but it means downloading all files. The dlist file recreate should be very quick, and we can hope that Repair doesn’t do anything unwanted.

Any thoughts on which way you’d like to go to get the backup healthy? Your concern about the 180 GB destination and its download time is reasonable. If you want to play it safe, you could have OneDrive’s web UI make a remote copy of the backup, and you could back up the local database on the PC. That way, if anything goes wrong in Repair, the entire backup can be reset to how it was for another attempt.
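For the “back up the local database” precaution, something as simple as a timestamped copy of the job’s sqlite file is enough. A minimal sketch, assuming you’ve looked up the real database path on the job’s Database screen (the path argument here is a placeholder, not Duplicati’s default):

```python
# Sketch: snapshot the job's local database before running Repair, so the
# pre-Repair state can be restored if something goes wrong.
import os
import shutil
import time

def snapshot_database(db_path, dest_dir):
    """Copy the sqlite database into dest_dir with a timestamp suffix."""
    os.makedirs(dest_dir, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = os.path.join(dest_dir, os.path.basename(db_path) + "." + stamp + ".bak")
    shutil.copy2(db_path, dest)  # copy2 preserves the file's timestamps
    return dest
```

Stop the backup job (or at least make sure nothing is running) before taking the copy, so the database isn’t mid-write.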


Yes, I downloaded the dlist files through the OneDrive online interface.

The backup is not super-valuable in the sense that it backs up my photos. It is meant as the ‘offsite’ part of the strategy, since digital memories are some of the most important data I have at home. But it is meant as a ‘last resort’ backup, since I have the photos duplicated on multiple computers. As far as I can tell, I’ve not lost any photos yet, so I’m not worried about losing the backup.

So, as I understand it, you’re saying:

  1. Turn off throttling
  2. Move the duplicati-TIMESTAMPHERE.dlist.zip.aes files to a different folder
  3. Run “repair” from the database / advanced settings page
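Step 2 above could be scripted if the destination were a locally mounted folder. With OneDrive you’d do the move through the web UI instead, but a sketch of the idea (the function name and folder layout are illustrative, not Duplicati tooling):

```python
# Sketch of step 2: move every dlist file into a quarantine folder rather than
# deleting it, so the files can be put back if Repair misbehaves.
import os
import shutil

def quarantine_dlists(backup_dir, quarantine_dir):
    """Move duplicati-*.dlist.zip.aes files aside; return the moved names."""
    os.makedirs(quarantine_dir, exist_ok=True)
    moved = []
    for name in sorted(os.listdir(backup_dir)):
        if name.startswith("duplicati-") and name.endswith(".dlist.zip.aes"):
            shutil.move(os.path.join(backup_dir, name),
                        os.path.join(quarantine_dir, name))
            moved.append(name)
    return moved
```

Note it deliberately leaves dblock and dindex files alone; only the dlist files showed the bad header here.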

I’ll do that and see what happens. There’s another backup scheduled for tomorrow, so it could be a good time to do so.

One easy thing you could also do, to make extra sure the local database has the information the dlist file will need, is to go to the backup job’s Restore screen and at least make sure the date of the affected backup is visible there.

What should happen on Repair is that the database’s information will replace the missing dlist with a new timestamped dlist file saying what that particular backup was supposed to contain. The dlist files are usually not needed for restore because the database has the same info locally, but if the drive fails, the DB is gone…


Ran the repair on the database, and looking at the server logs, it seems things went without an obvious error. If the overnight backup works well, should I just try the same AES decryption to see if that now works?

You don’t need to wait. The repair should have replaced the dlist files you moved. You can sort your OneDrive files by time and probably find them all together; I don’t think Repair would touch other files.

You can check them if you want to see whether they do the AES decryption OK, but they “should” be fine.


Looks like the dlist files are back where they should be, and the next backup didn’t return any errors. Does that mean I should be fine going forward?

Thanks!

An alternative would be to rely on smaller scans, which might be more tolerable. Ordinarily, `--backup-test-samples` tests only 1 set of three files per backup. You can raise it in Advanced options. There’s also an option that’s not in the manual yet, but has actually been out awhile. It works like:

  --backup-test-percentage (Integer): The percentage of samples to test after
    a backup
    After a backup is completed, some (dblock, dindex, dlist) files from the
    remote backend are selected for verification. Use this option to specify
    the percentage (between 0 and 100) of files to test. If the
    backup-test-samples option is also provided, the number of samples tested
    is the maximum implied by the two options. If the no-backend-verification
    option is provided, no remote files are verified.
    * default value: 0

Sampling is balanced so eventually everything should be covered. There’s no way to verify the other files are intact without reading them, but you have a choice of how rapidly you want to do verification.

Reading remote files can also be done via Restore, especially with the Advanced option `--no-local-blocks`.
Even better, if you have another system with the same OS, is a direct restore there to try out disaster recovery.

There is a bug in 2.0.4.5 that may make database creation slow, so you might consider 2.0.5.0 if doing the direct restore. You can uninstall Duplicati afterwards if you don’t want to keep it; it’s just for the restore test.

You did say that this wasn’t a super-valuable backup, so perhaps slow verification would be sufficient, but even without any known bumps like this unfortunate one, testing actual restores is best practice…

A whole restore might be kind of painful due to the size, so maybe a smaller sample restore, plus setting the post-backup verification test (the process is described in more depth here) higher, would be a good combo.
