Duplicati.Library.Interface.UserInformationException: Found remote files reported as duplicates, either the backend module is broken or you need to manually remove the extra copies


#1

I'm getting the following error with one of my backups to Google Drive. Should I just find the file in Google Drive storage and simply delete it?

~~~
Fatal error
Duplicati.Library.Interface.UserInformationException: Found remote files reported as duplicates, either the backend module is broken or you need to manually remove the extra copies.
The following files were found multiple times: duplicati-i0d8fe5d412d4436d85d3c9a136445a1e.dindex.7z.aes
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
~~~

Exception of type 'System.OutOfMemoryException' was thrown
#2

If you look at the Google Drive folder, do you see multiple copies of duplicati-i0d8fe5d412d4436d85d3c9a136445a1e.dindex.7z.aes? If so, do they have the same dates, sizes, permissions, etc.?
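If it's easier than eyeballing the web UI, a quick way to spot duplicate names is to dump the folder listing and count repeats. A minimal sketch (the listing below is hypothetical; the dblock name is made up for illustration):

```python
from collections import Counter

def find_duplicate_names(names):
    """Return file names that appear more than once in a remote listing."""
    counts = Counter(names)
    return sorted(name for name, n in counts.items() if n > 1)

# Hypothetical listing of the Duplicati destination folder:
listing = [
    "duplicati-i0d8fe5d412d4436d85d3c9a136445a1e.dindex.7z.aes",
    "duplicati-i0d8fe5d412d4436d85d3c9a136445a1e.dindex.7z.aes",  # duplicate
    "duplicati-b1234.dblock.7z.aes",  # made-up name for illustration
]
print(find_duplicate_names(listing))
# prints ['duplicati-i0d8fe5d412d4436d85d3c9a136445a1e.dindex.7z.aes']
```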

I'm not really sure how the duplicate would have been made, as Duplicati doesn't re-upload existing files. As far as I know, if it needs to re-send data that was already at the destination, it uploads it under a new file name, verifies that it arrived, and only then deletes the file with the old name.
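That upload-verify-delete sequence can be sketched like this. This is just an illustration of the flow as I described it, not Duplicati's actual code; the helper name and the dict standing in for the remote store are my own inventions:

```python
def replace_remote_file(backend, old_name, new_name, data):
    """Sketch: upload under a new name, verify it arrived, then delete the old copy.

    `backend` is a plain dict standing in for the remote destination.
    If verification fails, the old file is left untouched.
    """
    backend[new_name] = data             # 1. upload under the new name
    if backend.get(new_name) != data:    # 2. verify the upload arrived intact
        raise IOError("verification failed; old file left in place")
    backend.pop(old_name, None)          # 3. only now remove the old copy
    return backend

remote = {"duplicati-old.dindex.7z.aes": b"payload"}
replace_remote_file(remote, "duplicati-old.dindex.7z.aes",
                    "duplicati-new.dindex.7z.aes", b"payload")
print(sorted(remote))
# prints ['duplicati-new.dindex.7z.aes']
```

Note that if something interrupted this sequence between steps 2 and 3, both copies would be left at the destination, which is one way extra files could linger.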

Note that I've only heard of this issue once or twice before, and one of those times it was caused by an rsync from one destination to another creating the duplicates. I don't think either of those cases involved Google Drive, but just to be sure: you weren't doing anything like that, were you?

(By the way, I edited your post by adding “~~~” before and after the error to help it stand out more.)