I see this in my job log. Should I manually delete these files? Or is there a way to have Duplicati try to delete them again?
ignoring remote file listed as Deleted: duplicati-20180203T030000Z.dlist.zip.aes,
ignoring remote file listed as Deleted: duplicati-20180204T030000Z.dlist.zip.aes,
ignoring remote file listed as Deleted: duplicati-20180427T030000Z.dlist.zip.aes,
ignoring remote file listed as Deleted: duplicati-20180427T070000Z.dlist.zip.aes,
ignoring remote file listed as Deleted: duplicati-20180427T110000Z.dlist.zip.aes,
ignoring remote file listed as Deleted: duplicati-20180427T150000Z.dlist.zip.aes,
ignoring remote file listed as Deleted: duplicati-20180427T230000Z.dlist.zip.aes,
ignoring remote file listed as Deleted: duplicati-20180428T030000Z.dlist.zip.aes,
Strange… these 8 files look odd when I browse them in B2. They show 2 versions, the newer one being 0 bytes and hidden. Any ideas? (I don’t know why there are 2 versions, I have B2 bucket configured to only retain the latest version.)
I ended up just deleting the files using the B2 console…
As far as I know, Duplicati doesn’t do anything involving making files hidden, so I would guess that the hidden / 0-byte ones are something specific to how B2 does file processing.
Of course, it could be that incoming files to B2 are hidden by default and start out at 0 bytes until they’re done being uploaded, so maybe those are interrupted uploads?
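For what it’s worth, B2’s native `b2_list_file_versions` API reports hide markers as versions with `"action": "hide"` and a `contentLength` of 0, so you can spot them programmatically. Here is a rough sketch (not Duplicati’s actual code; the sample version dicts are made up for illustration, shaped like the API response) that finds file names whose newest version is a hide marker:

```python
# Sketch: find B2 file names whose newest version is a "hide" marker.
# Assumes version dicts shaped like B2's b2_list_file_versions response
# (fields: fileName, action, contentLength, uploadTimestamp).

def hidden_files(versions):
    """Return file names whose most recent version is a hide marker."""
    newest = {}  # fileName -> version dict with the largest uploadTimestamp
    for v in versions:
        name = v["fileName"]
        if name not in newest or v["uploadTimestamp"] > newest[name]["uploadTimestamp"]:
            newest[name] = v
    return sorted(n for n, v in newest.items() if v["action"] == "hide")

# Hypothetical sample data: one file hidden, one not.
versions = [
    {"fileName": "duplicati-20180203T030000Z.dlist.zip.aes",
     "action": "upload", "contentLength": 15728640, "uploadTimestamp": 1},
    {"fileName": "duplicati-20180203T030000Z.dlist.zip.aes",
     "action": "hide", "contentLength": 0, "uploadTimestamp": 2},
    {"fileName": "duplicati-20180204T030000Z.dlist.zip.aes",
     "action": "upload", "contentLength": 15728640, "uploadTimestamp": 3},
]
print(hidden_files(versions))
# -> ['duplicati-20180203T030000Z.dlist.zip.aes']
```

That matches what the bucket browser showed: two versions per file, with the newer one being the 0-byte hide marker.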
Am I correct in assuming the manual deletion of both the 0-byte and 15.x MB files resolved your initial problem?
Yes. The warnings stopped once I deleted those files.
I am guessing it was some quirk in B2… Duplicati told B2 to delete the files and it didn’t quite do it right for some reason. Not sure.
Glad it’s working (I’m flagging your post as the “Solution”), and it’s good to know it’s documented here.
If it happens again, we can maybe decide Duplicati needs a method to resolve it other than having users manually delete stuff from their destination.
Found out the root cause. I moved data from S3 to B2 using rclone. I would do an initial (somewhat time consuming) sync using rclone, then temporarily disable backup jobs while I do a final rclone sync. Once that completes I edit the backup job to target B2 and then resume normal backups.
rclone behavior with B2 when synchronizing “deletions” is to not actually delete the file, but rather hide it. This confused Duplicati.
Fortunately, rclone has an option, `--b2-hard-delete`, to do actual deletions when syncing to B2:
Permanently delete files on remote removal, otherwise hide files.
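For anyone hitting the same thing, the final sync would look something like this (the remote and bucket names here are placeholders, not from my actual setup):

```shell
# Final sync from S3 to B2, permanently deleting removed files
# instead of leaving 0-byte hide markers behind.
# "s3remote:" and "b2remote:" are placeholder rclone remote names.
rclone sync s3remote:my-backup-bucket b2remote:my-backup-bucket --b2-hard-delete
```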
Thought I’d follow up in case anyone else had this issue.
Great work! I went ahead and flagged your post as the solution in case others have this issue.
I have had a similar problem with OneDrive, although I don’t know the reason:
ignoring remote file listed as Deleted: duplicati-ie09222086d444eb5ad1e32e77dafc8fc.dindex.zip.aes
I deleted it manually from OneDrive, and then ran “verify” and backup - I use the gui - and now I get
ignoring remote file listed as Deleted (2 of them):
The backup is logged as a “success”
Anything I can do to get off this merry-go-round?
Just to confirm, pretend you’re going to do a restore and make sure the “success”ful backup appears in the restore list.
I suspect it will, as the error you’re running into is part of the post-backup cleanup process. So your backups are probably succeeding just fine.
Are you using a retention policy? If so, does the error go away if you (temporarily) set it to “Keep all versions”?
I was able to restore a test file successfully from a batch with the “ignoring remote file listed as Deleted” error, so hopefully “success” can be taken as meaning success. But the same warning appeared on the restore, so not a confidence booster!
I then thought I would try to manually delete the two files listed as deleted from the backup location, but they did not exist. I had assumed the files existed at the remote location while the local database thought they were deleted, but apparently not.
So I have no idea what is going on, bearing in mind that I had run a database verify and a repair, but maybe I need to recreate the database.
Sorry, haven’t had time to change the retention policies yet.
The only thing I can think of is that somehow the API call Duplicati is using to get the file list is cached or otherwise stale vs. what is provided to you as a logged-in user.
I believe there is a job-level log called “Remote” which includes the actual results Duplicati gets back from listing files on the remote system.
It’s not pretty, but perhaps you can find the offending file in that log?
I recreated the database, and the problem has gone away. Also updated to the latest “experimental” release.
Thanks for your help