Strange… these 8 files look odd when I browse them in B2. They show 2 versions, the newer one being 0 bytes and hidden. Any ideas? (I don’t know why there are 2 versions, I have B2 bucket configured to only retain the latest version.)
As far as I know Duplicati doesn’t do anything involving making files hidden so I would guess that the hidden / 0 byte ones are something specific to how B2 does file processing.
Of course it could be that incoming files to B2 are hidden by default and start out at 0 bytes until they’re done being uploaded so maybe those are interrupted uploads?
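If you want to check that hunch from the command line, here is a rough sketch assuming the b2 CLI (the bucket name is a placeholder, and the exact `ls` syntax varies a bit between b2 CLI versions). Listing with `--versions` should show hide markers as 0-byte entries next to the real file versions. The command is only echoed here; run it yourself once the CLI is installed and authorized:

```shell
# Placeholder bucket name -- substitute your own.
# "--versions" lists every version of every file, so B2 hide markers
# should appear as 0-byte entries alongside the real data versions.
CMD="b2 ls --long --versions my-duplicati-bucket"
echo "$CMD"
```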
Am I correct in assuming the manual deletion of both the 0 b and 15.x MB files resolved your initial problem?
Yes. The warnings stopped once I deleted those files.
I am guessing it was some quirk in B2… Duplicati told B2 to delete the files and it didn’t quite do it right for some reason. Not sure.
Found out the root cause. I moved data from S3 to B2 using rclone. I would do an initial (somewhat time consuming) sync using rclone, then temporarily disable backup jobs while I do a final rclone sync. Once that completes I edit the backup job to target B2 and then resume normal backups.
rclone behavior with B2 when synchronizing “deletions” is to not actually delete the file, but rather hide it. This confused Duplicati.
Fortunately rclone has an option to do actual deletions when syncing to B2:
`--b2-hard-delete`   Permanently delete files on remote removal, otherwise hide files.
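For anyone doing a similar migration, the final sync would then look something like this. The remote and bucket names below are placeholders for my setup; `--b2-hard-delete` is the actual rclone flag. The command is only echoed here, so substitute your own remotes before running it for real:

```shell
# Placeholder remote/bucket names -- adjust to your own rclone config.
# "--b2-hard-delete" makes rclone permanently delete files on B2 instead
# of hiding them, so Duplicati never sees leftover 0-byte hide markers.
CMD="rclone sync s3:old-bucket/duplicati b2:new-bucket/duplicati --b2-hard-delete"
echo "$CMD"
```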
Thought I’d follow up in case anyone else had this issue.
I was able to restore a test file successfully from a batch with the "ignore remote files listed as deleted" error, so hopefully "success" can be taken as meaning success. But the same warning appeared during the restore, so not a confidence booster!
I then thought I would manually delete the two files listed as deleted from the backup location, but they did not exist. I had assumed they existed at the remote location while the local database thought they were deleted, but it seems not.
So I have no idea what is going on, bearing in mind that I had already run a database verify and a repair. Maybe I need to recreate the database.
Sorry, haven’t had time to change the retention policies yet.
The only thing I can think of is that somehow the API call Duplicati is using to get the file list is cached or otherwise stale compared to what you see as a logged-in user.
I believe there is a job-level log called “Remote” which includes the actual results Duplicati gets back from listing files on the remote system.
It’s not pretty, but perhaps you can find the offending file in that log?