Found 4 files that are missing from the remote storage

Four files went missing from my backup storage, and I have a theory about why. I hope a developer can check my theory and see whether this is caused by a bug in Duplicati’s “Compact” process. I really doubt the files were deleted by me or by antivirus software. Here are some clues:

May 14 7:18 PM - a backup ran successfully

May 15 7:08 PM - a LIST command was logged in the Remote job log, showing a large list of files, including the 4 files that later went missing

May 15 7:20 PM - a compact job ran

Compacting because there are 109.48 MB in small volumes and the volume size is 100.00 MB

May 15 7:23 PM - the system log shows an exception:

Failed while executing “Backup” with id: 2
There is not enough space on the disk.

May 16 7:18 PM

  • a LIST command in the Remote job log shows a large list of files, and is missing the 4 files
  • multiple lines in the General job log say files were deleted:

removing file listed as Deleting: duplicati-ba32f53aa7b894883bef9fe4c3ebfaaeb.dblock.zip.aes
removing file listed as Deleting: duplicati-b5ad89a11bd524948999651d74f22613d.dblock.zip.aes
removing file listed as Deleting: duplicati-ba67c4d9e011d457e86acee4a504a359f.dblock.zip.aes
removing file listed as Deleting: duplicati-bfb278be84c2b4a74afb95e1d6fddc7cf.dblock.zip.aes
removing file listed as Temporary: duplicati-bcf8aea1183544f9ca11db39c2d688449.dblock.zip.aes
removing file listed as Temporary: duplicati-i3cbffe1917004f15b20849dc3b70bd7d.dindex.zip.aes

  • multiple lines in the General job log say 4 files are missing:

Missing file: duplicati-iaaa9d07845504fceadb121a46d34c767.dindex.zip.aes
Missing file: duplicati-ib09d8d58fdaf4579ad1a2aca74a7ef09.dindex.zip.aes
Missing file: duplicati-ic8166d61659e423fbf529269ec4c5ccb.dindex.zip.aes
Missing file: duplicati-ief5351d6419b4ec185131a49fbd0ed77.dindex.zip.aes

  • the System log shows an error:

Duplicati.Library.Interface.UserInformationException: Found 4 files that are missing from the remote storage, please run repair


So, my guess is that the Compact job failed with the out-of-space error, and that somehow led to files going missing from the remote storage. When I ran the “Affected” command on those 4 files, it reported that no files were affected by any of them, which suggests to me that the Compact intended to delete those files because they were no longer needed; but when the crash happened, Duplicati lost track of that, still believed the 4 files were needed, and marked them as missing.

Another related question: why did I get an out-of-space error? The remote storage has plenty of space, my dblock-size is 100MB, and my temp RAM disk is 500MB. How big does my temp space need to be?

I clicked the “Repair” button in the GUI and it regenerated the 4 missing dindex files and uploaded them to the backup location. I downloaded, decrypted, and extracted one of those files and saw that it contained only a single “manifest” file and nothing else. I believe that means the dindex file isn’t being used for anything, which supports my original guess: the “Compact” process rightly intended to delete this obsolete file, but somehow Duplicati didn’t remember that the deletion was intentional and is marking the file as “missing”.
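For anyone who wants to repeat that check, here is one way to do it, assuming the third-party pyAesCrypt package can read the AES Crypt (.aes) format that Duplicati uses; the file name and passphrase below are placeholders.

```python
import zipfile
import pyAesCrypt  # third-party; assumes Duplicati's .aes files use the AES Crypt format

# Placeholders: substitute your own dindex file name and backup passphrase.
encrypted = "duplicati-iaaa9d07845504fceadb121a46d34c767.dindex.zip.aes"
decrypted = "dindex.zip"
passphrase = "your-backup-passphrase"

# Decrypt the downloaded dindex volume, then list the zip contents.
pyAesCrypt.decryptFile(encrypted, decrypted, passphrase, 64 * 1024)
with zipfile.ZipFile(decrypted) as z:
    print(z.namelist())  # in my case, only "manifest" was listed
```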

I then ran a Verify, which passed, and a backup job, which is currently running without errors.

So, do the developers agree that there’s a bug somewhere with Duplicati not remembering that those files were intentionally deleted, and the files should not have been marked as missing?

I think there is a failure scenario that allows this situation to arise, but I don’t know if it’s a bug (something wrong in Duplicati), an unhandled scenario (Duplicati was never “told” how to handle a particular situation), or something Duplicati CAN’T handle (such as power failure while writing to the database).

Thanks to your input we may be able to replicate what happened in your case simply by forcing an “out of space” error during a compact process…


During backups with default settings you need at least 4x the dblock size (the “Upload volume size”), as Duplicati will “build ahead” up to 4 dblock files for the upload queue. So in your case, 400MB of temp space would be needed just to hold the pending uploads.

However, during a compact you may need more than that, because the compact downloads as many sparsely used dblock files as it needs to build a single fully used file.

It’s not quite as simple as this (hopefully @kenkendk will correct me if I’m wrong), but in a “worst case” scenario, if you have 5 dblock files each with only 20% useful content (I think that’s the default threshold), then with your 100MB dblock size your temp storage would need:

  • 500MB of space to store the 5 downloaded dblock files
  • PLUS space needed for the 20% useful content of those dblocks to be uncompressed (actual space needed would vary depending on the compression level and the compressibility of the content)
  • PLUS space for the new fully utilized dblock to be created

Of course, it’s unlikely all that space will be needed at once, since with each step some files from the previous step are no longer used and could be deleted. Plus, depending on how the code is actually written, it may be downloading and processing just one dblock at a time and adding to the new compressed file, rather than pulling them all down and building the new compressed file in a single pass. A rough worst-case estimate is sketched below.
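To put rough numbers on that worst case (treating the figures above as assumptions, since the actual compact code may process volumes incrementally and need less), here’s a quick back-of-envelope calculation:

```python
# Back-of-envelope worst-case temp space for a compact, based on the
# assumptions above (5 dblocks at the ~20% threshold, 100 MB dblock size).
dblock_size_mb = 100      # "Upload volume size"
threshold = 0.20          # assumed default usefulness threshold
volumes = 5               # 5 volumes at 20% useful data make one full volume

downloaded = volumes * dblock_size_mb               # 500 MB of downloaded dblocks
extracted = volumes * dblock_size_mb * threshold    # ~100 MB of still-useful content
new_volume = dblock_size_mb                         # 100 MB for the rebuilt dblock

print(f"Worst-case temp space: ~{downloaded + extracted + new_volume:.0f} MB")  # ~700 MB
```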

or something Duplicati CAN’T handle (such as power failure while writing to the database).

I have never used SQLite, but I assume it works like MSSQL: there’s a way to avoid this problem with a small amount of journaling.

  1. Write a row to the DB that says “I’m about to create/delete/modify a file in the file system or write a major edit to the DB.”
  2. Then, create/delete/edit the file or make the DB edit.
  3. Last, delete the row from step 1.

If you have a power failure or crash at any step, then the next time Duplicati starts up, it would see that it was in the process of doing action X, and can safely resume that task (or cleanly abort it) without leaving anything in a half-done state, aka corrupted. Regardless of which step it fails on, Duplicati will always be able to immediately return itself to a known-good state and there will never be a mismatch between what the database thinks and what the filesystem thinks. It’s an important feature for a super reliable data backup program.
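To make that concrete, here is a minimal sketch of the idea in Python/SQLite; the PendingAction table and the function names are made up for illustration, not how Duplicati actually structures its database.

```python
import os
import sqlite3

# Minimal sketch of an "intent journal": record what you are about to do,
# do it, then clear the record. Names here are illustrative only.
db = sqlite3.connect("journal-demo.sqlite")
db.execute("CREATE TABLE IF NOT EXISTS PendingAction (Id INTEGER PRIMARY KEY, Action TEXT, Path TEXT)")

def delete_with_intent(path):
    # Step 1: durably record the intent before touching the filesystem.
    cur = db.execute("INSERT INTO PendingAction (Action, Path) VALUES (?, ?)", ("delete", path))
    db.commit()
    row_id = cur.lastrowid
    # Step 2: perform the action.
    if os.path.exists(path):
        os.remove(path)
    # Step 3: clear the intent only after the action has completed.
    db.execute("DELETE FROM PendingAction WHERE Id = ?", (row_id,))
    db.commit()

def recover_on_startup():
    # Any row still present means a crash happened mid-action; the delete is
    # idempotent, so it is safe to finish it now and then clear the journal.
    for _, action, path in db.execute("SELECT Id, Action, Path FROM PendingAction"):
        if action == "delete" and os.path.exists(path):
            os.remove(path)
    db.execute("DELETE FROM PendingAction")
    db.commit()
```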

Thanks for the info about the temp folder. I didn’t see that anywhere else so it may be useful to add to the manual. I ended up removing my RAM disk because of the extra complexity, so now I have plenty of temp space. I’ll just endure any slowness until Duplicati eventually does all its work in RAM automagically.

Thank you. This exact scenario happened to me a second time, so it seems repeatable: an out-of-disk-space error during a compact, then a missing-files error during the next backup. I hope it helps!

I’m pretty sure that’s already being done (though I haven’t double-checked the code); my guess is there’s something else not being handled correctly. But thanks for the suggestion; if I’m wrong and it isn’t already done that way, maybe it can be implemented the way you describe.

I haven’t had a chance to test this out yet, but thanks for letting us know it happened again!

I have a similar issue, though mine is not related to limited disk space but is a result of using compact. See this issue for example. This has also been reported as an issue on GitHub.

Found 1 files that are missing from the remote storage, please run repair #1706 has more comments.

I won’t repeat all of the linked forum post, but it describes ways to check whether you hit the linked issue, which looks like a case where error handling during a compact caused Duplicati to roll back its DB transaction, thereby forgetting that it had deleted some dindex files (the deletes are verifiable through logs and maybe recycle/trash bins).
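For what it’s worth, here is a toy model (Python/SQLite) of how that could play out, assuming the remote deletes and the local DB bookkeeping share a single transaction. The table and backend names are invented for illustration; this is a guess at the mechanism, not Duplicati’s actual code.

```python
import sqlite3

# Toy model: remote deletes cannot be rolled back, but the DB bookkeeping can,
# so an error late in the compact leaves the DB expecting files that are gone.
class FakeBackend:
    def __init__(self, files):
        self.files = set(files)
    def delete(self, name):
        self.files.discard(name)  # a remote delete is permanent

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE RemoteVolume (Name TEXT PRIMARY KEY)")
names = ["duplicati-i1.dindex.zip.aes", "duplicati-i2.dindex.zip.aes"]
db.executemany("INSERT INTO RemoteVolume VALUES (?)", [(n,) for n in names])
db.commit()

backend = FakeBackend(names)
try:
    for n in names:
        backend.delete(n)                                      # remote file gone
        db.execute("DELETE FROM RemoteVolume WHERE Name = ?", (n,))
    raise OSError("There is not enough space on the disk.")    # compact fails later
except OSError:
    db.rollback()  # the DB forgets the deletes; the remote files stay deleted

in_db = {row[0] for row in db.execute("SELECT Name FROM RemoteVolume")}
missing = in_db - backend.files
print(f"Found {len(missing)} files that are missing from the remote storage")
```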

The forum post referenced was Continually needing repair. I posted clues there, and the clues here may be:

(and I’d note that the cited missing files were dindex files, which was also the case with my GitHub issue)