Hash mismatch on non-existent file

Interesting – and surprising to me. I don’t know how the temporary file tracking is done, but I now have 97 leftover files, with no visible problem other than wasted space. See Duplicati temp dup-xxxx files not being deleted. With this many jobs sharing the system, it seems like there statistically ought to be cross-job mix-ups of leftovers by now.

On the other hand, your finding is interesting and maybe someone will chase it. I’m glad you found a way out.

EDIT: We looked in the job 2 database at the RemoteVolume row for the ghost dblock file, which was there. Hearing that the file belongs to job 1, I wonder what job 1’s RemoteVolume table says about it. Filename collisions are theoretically possible, I suppose, but is there any common history at all between these two jobs? Collisions in temp file names may also be theoretically possible, but, as you noted, the filename changes a lot. One common point (I assume) is that both jobs share a Duplicati Server, so maybe the cross-connect is there, although your separating of the temp files may show otherwise. Basically, this confuses my limited understanding.
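For anyone wanting to do the same cross-check, here is a minimal sketch of looking up a backend filename in each job’s local SQLite database. The table and column names (`Remotevolume`, `Name`, `State`, `Hash`) are assumptions based on how the schema is usually described, so verify them against your own database before trusting the result:

```python
import sqlite3

def find_remote_volume(con, filename):
    """Return rows from the (assumed) Remotevolume table matching a backend filename."""
    return con.execute(
        "SELECT Name, State, Hash FROM Remotevolume WHERE Name = ?",
        (filename,),
    ).fetchall()

# Demo with an in-memory stand-in for a job database; the real check would
# open each job's .sqlite file instead (names below are made up).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Remotevolume (Name TEXT, State TEXT, Hash TEXT)")
con.execute(
    "INSERT INTO Remotevolume VALUES (?, ?, ?)",
    ("duplicati-b1234.dblock.zip.aes", "Verified", "examplehash="),
)

print(find_remote_volume(con, "duplicati-b1234.dblock.zip.aes"))
print(find_remote_volume(con, "duplicati-bghost.dblock.zip.aes"))  # prints []
```

Running this against both job databases would show whether the ghost dblock is recorded in one, both, or neither – which is exactly the question about job 1’s RemoteVolume table above.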

Although it may be impractical, if you can come up with reproducible steps, a GitHub issue can track that well.
