The way it looks now, if you uninstall the DEB it WILL delete the job configuration and the backup job indexes. I’m not sure about upgrades, so I’d be very careful. This is why I’m looking into splitting the file locations. I’m waiting on Server needs --parameters-file argument · Issue #2928 · duplicati/duplicati · GitHub before I start the package fixes, because the key is only having to do the package refactoring once.
I did a fresh install of the Canary Version. Same result.
Then I created a new backup job which finished with a warning: remote file duplicati-20171230T143742Z.dlist.zip.aes is listed as Uploaded with size 0 but should be 148765, please verify the sha256 hash
I then ran a “verify files”, which ended with no errors or warnings, so I assume everything is fine.
It seems the previous job was somehow messed up. No idea why or how.
Note that the “uploaded vs. should be” file size errors are often related to a delay in how quickly the destination can report on just-uploaded files (or, less often, a failed upload).
If you get the error at the start of a job, it’s likely a failed upload; if it only happens at the end, and only for recently uploaded content, then it’s likely the destination delay.
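To make the “destination delay” idea concrete, here is a minimal sketch (hypothetical helper names, not Duplicati’s actual code) of how a client could tolerate a slow-to-update remote listing by re-checking the reported size a few times before concluding the upload actually failed:

```python
import time

def wait_for_listed_size(list_remote_size, name, expected, retries=5, delay=10):
    """Poll the destination until the just-uploaded file is reported at its
    expected size. list_remote_size(name) is a hypothetical callback that
    returns the size the destination currently reports (possibly stale)."""
    for _ in range(retries):
        if list_remote_size(name) == expected:
            return True   # the destination's listing has caught up
        time.sleep(delay)
    return False          # still wrong after retries: likely a failed upload
```

With a check like this, a transient listing delay resolves within a few retries, and only a persistent mismatch is surfaced as a real error.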
Oh, right, you asked about that and I forgot to answer.
I believe in your case “item1” is the ID of the currently running task, and “item2” is the ID of the next task that will run as soon as “item1” is done.
While it isn’t showing in your case, I think an entry can also be a pair of an ID and the time it’s scheduled to start.
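To illustrate the two cases described above (purely a conceptual sketch, not Duplicati’s actual server API or data structure), each queue entry can be thought of as a task ID paired with an optional scheduled start time:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class QueueEntry:
    """Hypothetical model of one entry in the task queue."""
    task_id: int                      # the "item1"/"item2" style ID
    scheduled_at: Optional[datetime]  # None = runs as soon as the prior task finishes

running = QueueEntry(task_id=1, scheduled_at=None)  # "item1": currently running
next_up = QueueEntry(task_id=2, scheduled_at=None)  # "item2": starts when task 1 is done
timed = QueueEntry(task_id=3, scheduled_at=datetime(2018, 1, 1, 3, 0))  # scheduled start
```

So an entry with `scheduled_at=None` just waits its turn, while one with a timestamp waits for that time as well.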