Duplicati might have deleted files it shouldn't have?

I got the “Failed to decrypt data (invalid passphrase?): Message has been altered, do not trust content” error message for a backup job yesterday. The job has been running fine daily for a month until yesterday. I didn’t have time to look into it until today, after the job tried to run again, and it failed today with “Failed: Found 2 files that are missing from the remote storage, please run repair”.

After reading some threads here, I ran the command-line “test” command with “--full-remote-verification”, and I got this result:

Running commandline entry
Finished!

Listing remote folder …
Missing file: duplicati-i7bb4b4b117ef4638bd98332ad1260dd6.dindex.zip.aes
Missing file: duplicati-i366536a1a69844d999b7dbd3fcbd0813.dindex.zip.aes
Found 2 files that are missing from the remote storage, please run repair

ErrorID: MissingRemoteFiles
Found 2 files that are missing from the remote storage, please run repair
Return code: 100
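
Roughly what I ran, via the CLI equivalent of the GUI Commandline (the storage URL, passphrase, and sample count are placeholders, not my real values):

  duplicati-cli test "dropbox://BackupFolder?authid=PLACEHOLDER" 1 \
      --passphrase="PLACEHOLDER" \
      --full-remote-verification=true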

The backup destination is Dropbox, so I logged in and looked at the folder and activity. Dropbox reports that yesterday:

You deleted duplicati-i366536a1a69844d999b7dbd3fcbd0813.dindex.zip.aes

12:31 PM

and

You deleted duplicati-i7bb4b4b117ef4638bd98332ad1260dd6.dindex.zip.aes

12:31 PM

The problem is, I most certainly did not delete those files. I use Dropbox’s “selective sync” option so that the Duplicati destination folder isn’t even accessible from any of my devices via the GUI, since it would fill up my local hard disk. So how did these files get deleted when Duplicati is still expecting to see them? The only commands I’ve run so far are “test” and “list-broken-files”; the latter came back with result 0 (no issues?). I am really confused by this situation, because I’m assuming Duplicati wouldn’t remove files unless it knew it didn’t need them any longer.

On how many files did you run it, e.g. a short sample or all (which will download the entire backup, so might be slow)?

What other files with duplicati-* names were deleted? Usually a dindex and its dblock delete in pairs. Seeing a dlist file deletion would also be a clue, but logs are even better. Anything in the job logs or About → Show log → Stored? Try to match whatever you find with the actions you were taking at the time of the error. Sometimes no logs are generated, for example if the system is restarted while Duplicati is doing something.

Are you comfortable either looking in the database (it’s like a spreadsheet) or posting link to a bug report?

I’m pretty sure I ran with --full-remote-verification=true, which I just learned means it samples 30% of contents from each dblock.

The full list of file edits from that job shows a mix of dblock, dindex, and dlist files:

You deleted duplicati-i366536a1a69844d999b7dbd3fcbd0813.dindex.zip.aes
You deleted duplicati-b34ea4703b21446cf9a1bb4d2f2ab9bae.dblock.zip.aes
You deleted duplicati-i7bb4b4b117ef4638bd98332ad1260dd6.dindex.zip.aes
You deleted duplicati-b85c05a99e2db4da59953fcfc391ff31f.dblock.zip.aes
You added duplicati-ib672a96d3a8d4cae8486dfbcce2314ba.dindex.zip.aes
You added duplicati-b708d307742b64f3d87cb8fe0dbe4e050.dblock.zip.aes
You deleted duplicati-i4cfd38a5205c4a46b3db3b0f40e937fa.dindex.zip.aes
You deleted duplicati-b4fc263a0d4284a5f93f2bdcfdd2bc29f.dblock.zip.aes
You added duplicati-i15ef0b86db22434592e420c75e2e10d9.dindex.zip.aes
You added duplicati-b3b19ca271f584534a4c197dff78940f8.dblock.zip.aes
You deleted duplicati-i01604d36733346418f30e2480bd37b09.dindex.zip.aes
You deleted duplicati-bb3bbbef6034643f0aeaf9ed3e44f88a9.dblock.zip.aes
You deleted duplicati-i140ccc4462434fa597d7a41daf7d938d.dindex.zip.aes
You deleted duplicati-bc1af3d803502412f97ddcf2a2f33da29.dblock.zip.aes
You deleted duplicati-i45e99e2666f340f8855a5c26db5560a4.dindex.zip.aes
You deleted duplicati-bf1c2a9972c82458da8bafac54804a26b.dblock.zip.aes
You added duplicati-i68e82cd3495845bab0b9dd7b9a6e113c.dindex.zip.aes
You added duplicati-b0dbdda2d0b354f4e930b7a898d3bbc0b.dblock.zip.aes
You deleted duplicati-i323de50d024b4802a927dcb86d82eb0e.dindex.zip.aes
You deleted duplicati-b9b127936d7f248f69ddd54af80e32bb0.dblock.zip.aes
You deleted duplicati-iaf7dbdf8bf8b47b28d31257cc973b3e8.dindex.zip.aes
You deleted duplicati-bd2b08753927746e1b46d78dbfede6129.dblock.zip.aes
You deleted duplicati-i229932c280b44969a3072673a5be606e.dindex.zip.aes
You deleted duplicati-bb91cb20c091246f7a54b03209bdc42cb.dblock.zip.aes
You added duplicati-ie3e6e49e858841dcaf31c381c4e960ab.dindex.zip.aes
You added duplicati-bfc9c29391b2a4e08afb2abd171488496.dblock.zip.aes
You deleted duplicati-i07358af52edd4ad4b58c20d0aef67d1e.dindex.zip.aes
You deleted duplicati-ba9cc9a3812ee45a49ccacd59775382e7.dblock.zip.aes
You deleted duplicati-ibccc6674190443a8812fe3c07fda8194.dindex.zip.aes
You deleted duplicati-beaa4333b73ac4b31bfb16b32628bd85b.dblock.zip.aes
You added duplicati-ibf7bb0633a124d3abd2bcc56059c62a6.dindex.zip.aes
You added duplicati-bfea57c3a403b48b9842d0115081415d6.dblock.zip.aes
You deleted duplicati-i5fdadc9d4a954cbab927317e4242e514.dindex.zip.aes
You deleted duplicati-b8822bb389daf4611a0413277d7b58c11.dblock.zip.aes
You added duplicati-ia88c0de13c4c496e87e2f5adb41f6b14.dindex.zip.aes
You added duplicati-b0fef29f1ae9c498b8c34c0de31df5bde.dblock.zip.aes
You deleted duplicati-20210930T190952Z.dlist.zip.aes
You added duplicati-20211010T191127Z.dlist.zip.aes
You added duplicati-i6dfdfd6441a940dcb418e470fbc0e31b.dindex.zip.aes
You added duplicati-b3721569259794fd894d50fea3bf40a7c.dblock.zip.aes

The Duplicati stored log from that job shows the typical stack trace; these are the first few lines, but I can’t understand a word of it and it doesn’t seem to mention any files.

System.Security.Cryptography.CryptographicException: Failed to decrypt data (invalid passphrase?): Message has been altered, do not trust content ---> SharpAESCrypt.SharpAESCrypt+HashMismatchException: Message has been altered, do not trust content
  at SharpAESCrypt.SharpAESCrypt.Read (System.Byte[] buffer, System.Int32 offset, System.Int32 count) [0x00405] in <45c43198e80545aeabaa9b98839b4c7d>:0 
  at Duplicati.Library.Utility.Utility.CopyStream (System.IO.Stream source, System.IO.Stream target, System.Boolean tryRewindSource, System.Byte[] buf) [0x0003d] in <5d5d65b1e6e4427684abed657dc45730>:0 
  at Duplicati.Library.Encryption.EncryptionBase.Decrypt (System.IO.Stream input, System.IO.Stream output) [0x00008] in <c36ef9933c0448c2a3eb2c5c27b6f69b>:0

Duplicati is running on an Ubuntu server and backing up via a mounted SMB share. It had previously backed the same share up to a local NAS with no errors, and I don’t recall having any issues with the machine during that backup. Nothing in the Windows Event log, and nothing in syslog on the Ubuntu server other than when the drive was mounted for the backup and unmounted after.

I don’t think I have enough detail or knowledge to file a bug report; there’s a lot here I can’t answer because I don’t know what to save or what’s important. If there’s a debugging switch I can use to save better logs for later analysis I’d be happy to add it.

I run about 30 backup jobs to various destinations and this error pops up about once a month at random. Normally I just wipe the backup destination, delete the database, and start over, but since this is a remote destination and my upload speed is limited, I’m less inclined to do that.

In this case I tried deleting backup versions 0 and 1 from this job - I got some errors, but then I deleted and recreated the database, and the job appears to have run without errors today. Maybe there’s a command I can use to fully test/verify the entire backup set?

I’m skeptical about that, although it sounds similar to an undocumented full-block-verification behavior, where that option looks like it boosts the sample percentage from the default 20% of an examined dblock to 100%. The prior post might have meant “each dblock sampled”, which I’ll speak to after some code:

The above is the boost, and below is what it does. See the description of the sample_percent parameter:

As for the “each” (of the entire backup): unless you saw your entire backup downloaded, this is not possible.
Duplicati has no way to do a detailed examination of the contents of a dblock without first downloading it.

Above, one sees how the samples parameter controls the files to check, which are then downloaded.

The TEST command describes how to set the samples parameter:

If “all” is specified, all files in the backup will be tested.

That’s the Commandline arguments box. Using an empty box with full-remote-verification did 1 sample:

  Listing remote folder ...
  Downloading file (931.91 KB) ...
  Downloading file (9.33 KB) ...
  Downloading file (4.91 MB) ...
Examined 3 files and found no errors

full-remote-verification

After a backup is completed, some files are selected for verification on the remote backend. Use this option to turn on full verification, which will decrypt the files and examine the insides of each volume, instead of simply verifying the external hash

So “full” does not mean “all” dblocks. It means that a deeper look is taken inside the sample that you set.

It looks like some of this was a compact, described in more detail here. Compact downloads dblock files with excess wasted space (invisibly to your Dropbox log, but Duplicati’s log would show it), which would cause an “added” of a dblock and its dindex, and a “deleted” of the old ones. You deleted more than you added, which is the idea.
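
If you want to see or tune when compact triggers, these are the relevant knobs (a sketch only; the values here are illustrative, and duplicati-cli help compact has the exact names and defaults):

  --threshold=25             # percent of wasted space allowed before a volume is compacted
  --small-file-size=2mb      # volumes below this size count as "small" (illustrative value)
  --small-file-max-count=20  # compact once this many small volumes accumulate
  --no-auto-compact=true     # or turn automatic compacting off entirely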

Error during compact forgot a dindex file deletion, getting Missing file error next run. #4129

is possibly what this was. The original title mentioned a “403 error”, but the reproducible case was FTP, so no 403.
In that issue, Duplicati deleted a file during compact, hit an error, and lost its records of the deletion on the way out.

For the bug report option: you push a button, get a download, upload it somewhere (for example Dropbox), and post a link to it here.

DB Browser for SQLite on a copy of your database is another option. Look at the Remotevolume table to see the State value for your known-deleted files. You can filter at the top of the Name column, then look at State.
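
If you’d rather use the command line than DB Browser, the same check works with sqlite3 against a copy of the database (a sketch; the copy’s path is a placeholder):

  # Always query a copy, never the live database
  sqlite3 /tmp/job-db-copy.sqlite \
    "SELECT Name, State FROM Remotevolume WHERE Name LIKE '%dindex%';"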

The third option is that you might be able to catch the missing file name in About → Show log → Live → Warning.
Set up a tab to that. In another tab, press “Verify files” to see if the first tab logs problems (then click on them).

log-file=<path> will do the equivalent of the live log above, but it catches issues earlier and with better timestamps.

Nice opportunity to take a good look at it. If you can stand larger files, adding log-file-log-level=information would give you log records of uploads, downloads, deletes, and whether or not a compact was being run.

Then, unless you saved the original database, the DB evidence is destroyed, so there is no bug report or DB examination to do now. The good/bad news is that this comes up often enough that you can maybe do the debug steps in the future.

Use the test command with all. I’m not sure what your download speed is; this will download all the files.
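
A sketch of that (storage URL and passphrase are placeholders; in GUI Commandline, just put all in the Commandline arguments box):

  duplicati-cli test "dropbox://BackupFolder?authid=PLACEHOLDER" all \
      --passphrase="PLACEHOLDER" --full-remote-verification=true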


Thank you for all this. I am going to add the log file level to each backup. I have enough backups to local destinations that I’m sure this will pop up again fairly soon (just last week I had another failure to a local drive, wiped the backup and started over). It’s the remote ones that are problematic due to limited upload speed. Waiting on a fiber connection here!

Sounds like I need to add a --log-file=<path> option as well as the debug level for each backup job? I suppose I should create a unique log for each job?

The same one as here? In theory, remote drives are more at risk of problems during file transmission.

If there are multiple on the same system, that would make it easier to figure out what is what, although logs from jobs running together can probably be pulled apart, especially if the jobs run on a regular known schedule.

Same source, different destination. Last week it failed to a USB drive, and Sunday it failed to dropbox.

So I’ve added lines to each job like this:

--log-file=/home/user/software-to-dropbox.log
--log-file-log-level=Information

and saved each configuration after doing so. I started running jobs, but no log files are being created in that path (it’s my home user account). Do I need to restart the duplicati service or something?

Edit - interesting. When I select the live log and choose “Information” I don’t see anything at all, but when I choose “Profiling” I get logs on screen. I don’t understand that?

Edit 2: This may be a long-standing issue. Log-file defined with "--log-file" is not logging errors · Issue #3038 · duplicati/duplicati · GitHub

Are failure symptoms the same as this issue, e.g. “Failed to decrypt”, “missing from the remote”, etc.?

No. It should just go into effect the next time that job is started. Information has more gaps than Profiling when you peek at some random time, but it should at least say something right at the job start. Example:

2021-10-06 17:50:00 -04 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started

Maybe, but with no additional reports (other than your mention) in 3 years, it would seem an unusual one.
That issue might have had permission problems with /var/log too, unless Duplicati was running as root…

Yes, it was the same failure to decrypt message.

I’ve set up a world-writable directory called /duplicati and I’ll see if I can get a log file writing there. I’ll play around with the log-file options in one job to see if I can get any output at all.

Seems to indicate log-level would be deprecated eventually but wasn’t followed up on, so I’m not sure which options I need. Maybe I need a filter=*? I am running 2.0.6.1_beta_2021-05-03.

Replying to myself - either creating the new directory or adding “log-level=Profiling” is what fixed it; I’m getting logs now.

It was followed up on, but that doesn’t matter because you’re not using it – you’re using the newer option.

C:\ProgramData\Duplicati\duplicati-2.0.5.114_canary_2021-03-10>Duplicati.CommandLine.exe help log-level
  --log-level (Enumeration): Log information level
    [DEPRECATED]: Use the log-file-log-level and console-log-level options
    instead
    Specifies the amount of log information to write into the file specified
    by --log-file
    * values: ExplicitOnly, Profiling, Verbose, Retry, Information, DryRun,
    Warning, Error
    * default value: Warning

I think I figured out what happened. I had paused Duplicati last night, and some jobs were scheduled overnight. When I altered the configs and restarted Duplicati, perhaps the jobs had already been queued with the old config files (no logging). Once those queued jobs finished, the logging started working for new jobs.

Everything is being logged at the “information” level now, which hopefully will provide some insight on the next failure.

Really appreciate your help!


Well I just got the error.

2021-10-13 13:00:38 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2021-10-13 13:00:48 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2021-10-13 13:00:56 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (3.88 KB)
2021-10-13 13:07:12 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ba2d476c87c0c40cd964846effe17efbd.dblock.zip.aes (49.99 MB)
2021-10-13 13:08:03 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ba2d476c87c0c40cd964846effe17efbd.dblock.zip.aes (49.99 MB)
2021-10-13 13:08:03 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i5de38712aed54f3eaef26c061401bf90.dindex.zip.aes (18.18 KB)
2021-10-13 13:08:05 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i5de38712aed54f3eaef26c061401bf90.dindex.zip.aes (18.18 KB)
2021-10-13 13:08:05 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bb277e007c2cd41e1a32808b02d93544a.dblock.zip.aes (49.99 MB)
2021-10-13 13:08:57 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bb277e007c2cd41e1a32808b02d93544a.dblock.zip.aes (49.99 MB)
2021-10-13 13:08:57 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i9ad2c357be63467fb32e4a9c021f2de0.dindex.zip.aes (18.31 KB)
2021-10-13 13:08:59 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i9ad2c357be63467fb32e4a9c021f2de0.dindex.zip.aes (18.31 KB)
2021-10-13 13:08:59 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b5593d80b930840519c3293354837b522.dblock.zip.aes (49.91 MB)
2021-10-13 13:09:49 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b5593d80b930840519c3293354837b522.dblock.zip.aes (49.91 MB)
2021-10-13 13:09:49 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-id43a83c581e747aeb2213a3586358b6e.dindex.zip.aes (56.79 KB)
2021-10-13 13:09:52 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-id43a83c581e747aeb2213a3586358b6e.dindex.zip.aes (56.79 KB)
2021-10-13 13:09:52 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bf334e9bf054f41478026ddc370708d58.dblock.zip.aes (49.92 MB)
2021-10-13 13:10:43 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bf334e9bf054f41478026ddc370708d58.dblock.zip.aes (49.92 MB)
2021-10-13 13:10:43 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i9db871242d8748e79cd257a241ffa7b4.dindex.zip.aes (238.93 KB)
2021-10-13 13:10:45 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i9db871242d8748e79cd257a241ffa7b4.dindex.zip.aes (238.93 KB)
2021-10-13 13:10:45 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b35f15a1e1e964993b549a4b9727d5b0f.dblock.zip.aes (49.92 MB)
2021-10-13 13:11:37 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b35f15a1e1e964993b549a4b9727d5b0f.dblock.zip.aes (49.92 MB)
2021-10-13 13:11:37 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i5a6893a8c0e748ccb99fb885fad883c2.dindex.zip.aes (84.06 KB)
2021-10-13 13:11:39 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i5a6893a8c0e748ccb99fb885fad883c2.dindex.zip.aes (84.06 KB)
2021-10-13 13:13:55 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b646b3d3162aa47bdb84627b8f82f6121.dblock.zip.aes (37.70 MB)
2021-10-13 13:14:36 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b646b3d3162aa47bdb84627b8f82f6121.dblock.zip.aes (37.70 MB)
2021-10-13 13:14:36 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i70de9548a3a34069a0f88635e8741595.dindex.zip.aes (116.89 KB)
2021-10-13 13:14:39 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i70de9548a3a34069a0f88635e8741595.dindex.zip.aes (116.89 KB)
2021-10-13 13:14:39 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20211013T200038Z.dlist.zip.aes (8.83 MB)
2021-10-13 13:14:50 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20211013T200038Z.dlist.zip.aes (8.83 MB)
2021-10-13 13:14:50 -07 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteRemoteFileset]: Deleting 1 remote fileset(s) ...
2021-10-13 13:14:55 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20211002T191007Z.dlist.zip.aes (8.60 MB)
2021-10-13 13:14:56 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20211002T191007Z.dlist.zip.aes (8.60 MB)
2021-10-13 13:14:56 -07 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: Deleted 1 remote fileset(s)
2021-10-13 13:15:03 -07 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting because there are 56.37 MB in small volumes and the volume size is 50.00 MB
2021-10-13 13:15:04 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b272bd097467d4eb18bc95497f8ffe78e.dblock.zip.aes (50.07 MB)
2021-10-13 13:15:08 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b272bd097467d4eb18bc95497f8ffe78e.dblock.zip.aes (50.07 MB)
2021-10-13 13:15:08 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b486cce91733b4472b59ff2300980d914.dblock.zip.aes (49.98 MB)
2021-10-13 13:15:13 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b486cce91733b4472b59ff2300980d914.dblock.zip.aes (49.98 MB)
2021-10-13 13:15:13 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b4e53f5674ae34e8999b449bc5a237f13.dblock.zip.aes (49.93 MB)
2021-10-13 13:15:18 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b4e53f5674ae34e8999b449bc5a237f13.dblock.zip.aes (49.93 MB)
2021-10-13 13:15:18 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:15:22 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:15:32 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:15:35 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:15:45 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:15:49 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:15:59 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:16:03 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:16:13 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:16:17 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes (46.50 MB)
2021-10-13 13:16:17 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b87b40a9fe7d04c269321d78b48363e54.dblock.zip.aes (13.61 MB)
2021-10-13 13:16:18 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b87b40a9fe7d04c269321d78b48363e54.dblock.zip.aes (13.61 MB)
2021-10-13 13:16:18 -07 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Security.Cryptography.CryptographicException: Failed to decrypt data (invalid passphrase?): Message has been altered, do not trust content ---> SharpAESCrypt.SharpAESCrypt+HashMismatchException: Message has been altered, do not trust content
  at SharpAESCrypt.SharpAESCrypt.Read (System.Byte[] buffer, System.Int32 offset, System.Int32 count) [0x00405] in <45c43198e80545aeabaa9b98839b4c7d>:0
  at Duplicati.Library.Utility.Utility.CopyStream (System.IO.Stream source, System.IO.Stream target, System.Boolean tryRewindSource, System.Byte[] buf) [0x0003d] in <5d5d65b1e6e4427684abed657dc45730>:0
  at Duplicati.Library.Encryption.EncryptionBase.Decrypt (System.IO.Stream input, System.IO.Stream output) [0x00008] in <c36ef9933c0448c2a3eb2c5c27b6f69b>:0
   --- End of inner exception stack trace ---
  at Duplicati.Library.Main.AsyncDownloader+AsyncDownloaderEnumerator+AsyncDownloadedFile.get_TempFile () [0x00008] in <34c05650075c4a0583e29e116862f000>:0
  at Duplicati.Library.Main.Operation.CompactHandler.DoCompact (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Boolean hasVerifiedBackend, System.Data.IDbTransaction& transaction, Duplicati.Library.Main.BackendManager sharedBackend) [0x00259] in <34c05650075c4a0583e29e116862f000>:0
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBackend, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x003a2] in <34c05650075c4a0583e29e116862f000>:0
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <34c05650075c4a0583e29e116862f000>:0
  at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00da9] in <34c05650075c4a0583e29e116862f000>:0
2021-10-13 13:16:19 -07 - [Information-Duplicati.Library.Modules.Builtin.SendMail-SendMailComplete]: Email sent successfully using server: smtp://mail.REMOVED.com:587/?starttls=always

Looks like it tried to GET the file “duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes” five times and failed. I have downloaded that file from Dropbox and saved it on my desktop for analysis, so I’ll go back to your earlier messages and see if I can figure out what the next steps are. I won’t re-run the operation or run any repair operations in the meantime.

If I’m understanding this log correctly, Duplicati was trying to do a compact operation. It was going to download 4 files to compact them, and it wasn’t able to download one of them. Which means the compact operation couldn’t continue.

If that was the problem, I suppose I can try increasing the global http timeout and the number of retries.

number-of-retries

--number-of-retries = 5
If an upload or download fails, Duplicati will retry a number of times before failing. Use this to handle unstable network connections better.
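
A sketch of raising those limits (option names are from the built-in help; the values are only illustrative, not recommendations):

  --number-of-retries=10        # default is 5
  --retry-delay=30s             # wait between retries; default is 10s
  --http-operation-timeout=10m  # the "global http timeout" mentioned above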

While it will help with flaky network connections, a problem that’s actually in the file will keep coming back.

If you prefer a GUI tool over Duplicati’s installed CLI SharpAESCrypt.exe, try decrypting it with AES Crypt.
If it won’t decrypt (decrypt a known-good file first to verify the procedure), it’s useless and its loss will affect some files.
The AFFECTED command will say which ones, if you tell it the dblock file name that you’re interested in…
You can run that in GUI Commandline. Disaster Recovery gives information on recovering from bad files. One amendment is that it can’t always tell which files are bad, but deleting bad files will make it very clear.
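
For the CLI route on Linux, a sketch (the install path is an assumption for a typical Ubuntu package; the syntax is e to encrypt, d to decrypt):

  mono /usr/lib/duplicati/SharpAESCrypt.exe d "PASSPHRASE" \
      duplicati-b04ff6a788a4940f1b1714746858746cf.dblock.zip.aes decrypted.zip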

Looking at the bad file’s date will show roughly how long it’s been around. Its size may be a clue, and DB information is even better if you want to go there (provided DB hasn’t been recreated and lost its history).


OK that’s helpful! I understand. AES Crypt indeed says the file is broken (and I was able to decrypt a different dblock file as you suggested).

AFFECTED says one file is affected. Both the affected jpg file and the dblock file were created the same day, July 21, 2021.

I’ll look into disaster recovery and see how to move forward.

Update: Oddly, both list-broken-files and purge-broken-files return code 0 and say “no files found to be missing, and no filesets recorded as broken”. I don’t know how to recover from this because Duplicati isn’t aware of anything being wrong, yet when I try to restore the jpg file I’m getting the same failure/retrying issue because that one dblock file is corrupt.

Update 2: I renamed the dblock file (added “badfile” to the end of the filename) and re-ran list-broken-files. This time Duplicati listed the jpg file as broken in all 10 versions, and a purge worked on ten backup sets.
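
For anyone following along, roughly the sequence I used (storage URL and passphrase are placeholders; the rename was done on the Dropbox side first):

  duplicati-cli list-broken-files "dropbox://BackupFolder?authid=PLACEHOLDER" --passphrase="PLACEHOLDER"
  duplicati-cli purge-broken-files "dropbox://BackupFolder?authid=PLACEHOLDER" --passphrase="PLACEHOLDER"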

My understanding: a dblock file was broken at some point, probably at the time it was originally uploaded because Dropbox shows no revisions for that file. Duplicati wasn’t aware of the problem until it tried to run a compact that happened to include that dblock file, which is when I got the “Failed to decrypt data” message.

In order to fix it, I had to rename the “bad” dblock file so that Duplicati could understand it was broken, then purge the files contained in the bad dblock from all backup sets, and then repair the database (Duplicati complained about the extra file after I had renamed it).

My questions are:

  1. How can Duplicati be made aware of bad dblock files sooner? Probably by scanning more dblock files at the end of the backup; I know there’s a percentage value I can increase (see the sketch after these questions). I can also use the “test” command, which I need to learn more about.

  2. When the dblock was found to be “altered” (“corrupted” is the more typical term), why didn’t Duplicati tell me which dblock file was corrupted in the error message? The stack trace doesn’t appear to show the filename.
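
On question 1, the knobs I mean look roughly like this (names from the docs; values are illustrative):

  --backup-test-samples=5          # sample sets verified after each backup (default 1)
  --backup-test-percentage=10      # or verify a percentage of the backup instead
  --full-remote-verification=true  # decrypt and inspect the sampled volumes, not just their hashes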

My ideal scenario here would be something like:

“Fatal error: dblock file FILENAME is corrupt and unreadable; this error was discovered during a COMPACT operation. Here is a list of source files that are impacted by this problem: (list). Press the “purge” button below to remove those impacted files from affected backup sets and delete the corrupted dblock file. Then it is suggested that you immediately re-run this backup job to ensure all source files are contained in at least one recent backup set.”

From the log it seems that you were probably running a compact operation (oh, you said it later)? If that is aborted, things will break, as I’ve posted here: FTP delete not atomic / verified / transactional (?)

Might be related, might not. But based on your description, it seems pretty familiar to me. Actually I should fix that topic’s title; the problem has nothing to do with FTP. It’s a general flaw. → Let’s see if I can update it.

With most encrypted & replicated transports / storage, true corruption is very unlikely. But sure, it can happen. Yet the problems I’ve been seeing are systemic issues.

If you want to know about corrupted blocks sooner, I would recommend running test. But technically, when compact messes up, it’s not a corruption / back-end issue at all.

Just as a warning: a test with the local database doesn’t always reveal problems. You’ll need to do a clean restore, and it might reveal many problems which test didn’t.

As an example, a broken file might be something like compaction running non-transactionally: it uploads some file partially and fails. There should be a cleanup for such a process, just like file-system journal recovery, but that never happens. I don’t know why, but it seems like a systemic problem.

With this light analysis I can’t say it would be your case. But I’ve clearly reported those two cases and confirmed that they happen over and over again. Afaik, the current version doesn’t remedy those problems.


All the things you name. It can often detect corrupted files instantly with a directory listing. Other times, the only way to detect file damage is to download the files and see whether their contents match the records of contents.
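
For the download case, the job database records each remote file’s expected size and hash, so a downloaded file can also be spot-checked by hand. A sketch, assuming (as in current versions, I believe) the Remotevolume Hash column is a Base64 SHA-256, with a placeholder file name:

  # Hash the downloaded file the way Duplicati records it
  openssl dgst -sha256 -binary duplicati-bEXAMPLE.dblock.zip.aes | base64
  # Compare against what the job database expects (query a copy of the DB)
  sqlite3 /tmp/job-db-copy.sqlite \
    "SELECT Hash, Size FROM Remotevolume WHERE Name = 'duplicati-bEXAMPLE.dblock.zip.aes';"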

It does tell you its name, at least sometimes. A stack trace is an uncontrolled thing, but sometimes handy.
Combine logs in GUI #1152 is part of the current logging problem: there are too many places for logs to exist.
In addition to the per-job logs, there is the server log at About → Show log → Stored, and its → Live version, which is a good way to see names if one is firmly expecting them. Or set up log-file=<path>; I think it names names, but I haven’t tested. What I did test was an intentionally corrupted file and Verify files.
Job log was made, and says:

  • 2021-10-18 13:52:52 -04 - [Error-Duplicati.Library.Main.Operation.TestHandler-RemoteFileProcessingFailed]: Failed to process file duplicati-b2f339aa1819447fa8b1806d851d20e1b.dblock.zip.aes

About → Show log → Live → Warning shows:

Oct 18, 2021 2:19 PM: Failed to process file duplicati-b2f339aa1819447fa8b1806d851d20e1b.dblock.zip.aes

Commandline test shows:

Failed to process file duplicati-b2f339aa1819447fa8b1806d851d20e1b.dblock.zip.aes => Failed to decrypt data (invalid passphrase?): Message has been altered, do not trust content
  Downloading file (813 bytes) ...
duplicati-b2f339aa1819447fa8b1806d851d20e1b.dblock.zip.aes: 1 errors
	Error: Failed to decrypt data (invalid passphrase?): Message has been altered, do not trust content

For all these cases that work, unfortunately I think backup failures tend to leave no job log, and the server log doesn’t have the information you’d like. The above issue cites “technical reasons” for having two logs.

Great wish. I wish there were lots of volunteer developers to grant it to us, but they are exceedingly rare, and the lack of developers is what limits the development pace. Volunteers are encouraged in all other areas as well, even if it’s just knowing Duplicati well enough to get fairly good at answering the more routine questions.


I understand. I’ve tried to help a few times on this forum myself when I can, and hopefully taking the time to do some detective work with a lot of help from people like you helps others.

This experience taught me a bit more about doing logging in Duplicati. Maybe the following helps someone.

I have created the directory /var/log/duplicati/ and I’ve set global settings to:

--console-log-level=Profiling
--console-log-filter=-*.Database.*
--log-file-log-level=Information
--log-file-log-filter=+Retry*;+*RetentionPolicy*;

Then within each backup job, I added:

--log-file=/var/log/duplicati/JOBNAME.log

This gives me a really clean view of what’s happening with each job, and I can tail the logfiles while the job is running for testing.
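
For example:

  # Watch a job live; BackendEvent lines show uploads, downloads, and deletes
  tail -f /var/log/duplicati/JOBNAME.log | grep BackendEvent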

I also created /etc/logrotate.d/duplicati:

/var/log/duplicati/*.log {
        weekly
        missingok
        rotate 12
        # compress is needed for delaycompress below to have any effect
        compress
        delaycompress
        notifempty
        create 640 root adm
        sharedscripts
}
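
A quick dry run confirms logrotate parses the file (--debug only prints what would happen, without touching the logs):

  logrotate --debug /etc/logrotate.d/duplicati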

Hopefully this will help track issues as they come up. Despite my lack of deep familiarity with Duplicati, I’m probably a good test case because I have so many jobs running with a mix of sources and destinations (CIFS/NFS, SSHFS to local NAS, Dropbox, and Backblaze B2).

I understand a “test” (or “verify”) is really the only option. I can run that on my local backups and Dropbox, where there are no download limits, but Backblaze B2 charges for downloads after the first 10 GB per day. I had set my backup-test-percentage to 5 and hit my B2 limit very quickly, so now I’ve set globally:

--backup-test-percentage=2

and then overrode that for B2 jobs by setting it to zero, which does the standard test of one dblock, one dlist, and one dindex file.

I’ll run this setup for a while and keep an eye on things for the next time an error pops up.


Diversity plus logs and some interest is a terrific combination for catching things and maybe getting clues.
I run a profiling log all the time just in case something “interesting” happens, but my variety is quite lacking. There’s a bit more one can do, but thanks for the great start.
