Delete dlist file not found & exclude ~$ temporary files

Hello,

I am running “2.0.5.1_beta_2020-01-18”.

I would be grateful for your assistance with two issues, please:

  1. I have two backup sets configured, both backing up to the same Google Drive account but using different destination paths. I only have this issue with one of the backup sets.
    From time to time, the problematic backup set flags a warning stating:

"2020-04-14 19:01:07 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20200414T180001Z.dlist.zip.aes ()", "2020-04-14 19:01:08 +01 - [Information-Duplicati.Library.Main.BackendManager-DeleteRemoteFileFailed]: Delete operation failed for duplicati-20200414T180001Z.dlist.zip.aes with FileNotFound, listing contents" ], "Warnings": [ "2020-04-14 19:01:08 +01 - [Warning-Duplicati.Library.Main.BackendManager-DeleteRemoteFileFailed]: Listing confirms file duplicati-20200414T180001Z.dlist.zip.aes was not deleted" ],

The complete log is posted here on Pastebin.
Thinking that this was an inconsistency between the local database and what is on the remote Google Drive storage, I first tried a repair, and when the warning reappeared I did a database recreate. However, the warning continues to reappear from time to time. I would be grateful for any help, please.

  2. Secondly, I wish to exclude temporary (Microsoft Office) files that begin with ~$, for example “~$Nameofdocument.xlsx”. I have tried the filter “Exclude files whose names contain” and entered “~$” (without quotes) in the adjacent text box via the GUI - see the attached screenshot.

What I am trying to achieve here is to prevent Duplicati from backing up these temporary files from open Microsoft Office documents. I.e., if I have a Microsoft Office document open at the time, Duplicati flags a warning stating that it was unable to back up the “~$Nameofdocument.xlsx” file because it is open. As this file is simply a temporary lock file and has no value, I thought I would exclude these files from being backed up by Duplicati. However, despite entering the above filter, I still receive warnings from Duplicati that it was unable to back up the “~$Nameofdocument.xlsx” file because it was open. Therefore, the filter does not appear to work.

I would be grateful for any assistance you can provide please.
Thank you very much.

Hello,

I can’t help you with #1 - I don’t use Google Drive myself and don’t know if those occasional warnings are “normal.”

#2 - The Duplicati web UI is a bit buggy when entering filters using those dropdowns. Instead, try clicking the three-dot menu to the right of the filter area and using Edit as Text. Can you show me the raw text of your current filters?

I have this:
-[.*~$[^\\]*]

Thank you.

$ has special meaning in regex, so you’ll need to escape it. Change the filter to:

-[.*~\$[^\\]*]
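If it helps to see why, here is a quick Python check (the path is a made-up example; as far as I know Duplicati applies these regex filters to the full path) showing that an unescaped $ acts as an end-of-string anchor, so the original pattern can never match a ~$ prefix, while the escaped one does:

import re

# Unescaped, "$" is an end-of-string anchor, so ".*~$[^\\]*" can only
# match a path that ends in "~". Escaped as "\$" it matches a literal $.
broken = re.compile(r".*~$[^\\]*")
fixed = re.compile(r".*~\$[^\\]*")

path = r"C:\Users\me\Documents\~$Nameofdocument.xlsx"  # example path only

print(bool(broken.fullmatch(path)))  # False - the lock file is never excluded
print(bool(fixed.fullmatch(path)))   # True  - excluded as intended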

This is the first report of this that I can find in the forum or in the GitHub issues. The closest (but not identical) entries in my own logs were:

2020-03-12 18:01:49 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: Starting - RemoteOperationDelete
2020-03-12 18:01:49 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20200312T215943Z.dlist.zip.aes ()
2020-03-12 18:01:50 -04 - [Information-Duplicati.Library.Main.BackendManager-DeleteRemoteFileFailed]: Delete operation failed for duplicati-20200312T215943Z.dlist.zip.aes with FileNotFound, listing contents
2020-03-12 18:01:51 -04 - [Information-Duplicati.Library.Main.BackendManager-DeleteRemoteFileSuccess]: Listing indicates file duplicati-20200312T215943Z.dlist.zip.aes was deleted correctly
2020-03-12 18:01:51 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: RemoteOperationDelete took 0:00:00:02.155
2020-03-12 18:01:51 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: Starting - RemoteOperationNothing
2020-03-12 18:01:51 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: RemoteOperationNothing took 0:00:00:00.000
2020-03-12 18:01:51 -04 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: Deleted 1 remote fileset(s)
2020-03-20 08:01:17 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: Starting - RemoteOperationDelete
2020-03-20 08:01:17 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20200320T115935Z.dlist.zip.aes ()
2020-03-20 08:01:18 -04 - [Information-Duplicati.Library.Main.BackendManager-DeleteRemoteFileFailed]: Delete operation failed for duplicati-20200320T115935Z.dlist.zip.aes with FileNotFound, listing contents
2020-03-20 08:01:18 -04 - [Information-Duplicati.Library.Main.BackendManager-DeleteRemoteFileSuccess]: Listing indicates file duplicati-20200320T115935Z.dlist.zip.aes was deleted correctly
2020-03-20 08:01:18 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: RemoteOperationDelete took 0:00:00:00.700
2020-03-20 08:01:18 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: Starting - RemoteOperationNothing
2020-03-20 08:01:18 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: RemoteOperationNothing took 0:00:00:00.000
2020-03-20 08:01:18 -04 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: Deleted 1 remote fileset(s)

Both are reasonable endings, as the files had failed their initial upload. Maybe this was a day when Google Drive was failing every initial upload and then working on the retry - presumably some Google problem.

A failed upload of this sort of file (with dlist in the name) is retried under a new name whose timestamp is 1 second later.
The original name then gets a delete request. If the file never actually uploaded, the delete fails, but everything is OK…
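To make that sequence concrete, here is a rough Python sketch of the behavior just described (the backend object, its methods, and the exception types are hypothetical stand-ins, not Duplicati’s actual code):

from datetime import datetime, timedelta

class UploadError(Exception): pass    # hypothetical stand-ins for the
class FileNotFound(Exception): pass   # backend's real error types

def put_dlist_with_retry(backend, dlist_name, data):
    # Sketch only: 'backend' and its put/delete/list are hypothetical.
    try:
        backend.put(dlist_name, data)
        return
    except UploadError:
        pass
    # Retry under a new name whose embedded timestamp is 1 second later,
    # e.g. ...20200414T180001Z... becomes ...20200414T180002Z...
    stamp = datetime.strptime(
        dlist_name[len("duplicati-"):-len("Z.dlist.zip.aes")], "%Y%m%dT%H%M%S")
    retry_name = "duplicati-{}Z.dlist.zip.aes".format(
        (stamp + timedelta(seconds=1)).strftime("%Y%m%dT%H%M%S"))
    backend.put(retry_name, data)
    # The original name then gets a delete request. If the first upload
    # never actually landed, the delete fails with FileNotFound and a
    # listing merely confirms the file is absent - harmless, as noted.
    try:
        backend.delete(dlist_name)
    except FileNotFound:
        backend.list()  # confirm the stray name really is gone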

Your log doesn’t collect enough detail. If you think this might happen again, please set up a --log-file Advanced option with --log-file-log-level=Retry to try to get a better look at what might be happening.
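For example, the two Advanced options might look like this (the log path is just an example; put it anywhere Duplicati can write):

--log-file=C:\Duplicati\problem-backup.log
--log-file-log-level=Retry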

My initial failure is shown below, and I have no idea why Google was momentarily telling me “Forbidden”:

2020-03-20 08:00:53 -04 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-20200320T115935Z.dlist.zip.aes attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
   at Duplicati.Library.Utility.AsyncHttpRequest.AsyncWrapper.GetResponseOrStream()
   at Duplicati.Library.Utility.AsyncHttpRequest.GetResponse()
   at Duplicati.Library.JSONWebHelper.GetResponse(AsyncHttpRequest req, Object requestdata)
   at Duplicati.Library.JSONWebHelper.ReadJSONResponse[T](AsyncHttpRequest req, Object requestdata)
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.GetFolderId(String path, Boolean autocreate)
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.get_CurrentFolderId()
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.<List>d__17.MoveNext()
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.<PutAsync>d__15.MoveNext()

Here are some other (403) Forbidden errors, which generally failed only on the first attempt and then worked:

2020-03-14 07:00:33 -04 - [Retry-Duplicati.Library.Main.BackendManager-RetryList]: Operation List with file  attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-14 07:01:22 -04 - [Retry-Duplicati.Library.Main.BackendManager-RetryDelete]: Operation Delete with file duplicati-i04cbd4713167488ab244074d313bc4c4.dindex.zip.aes attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-20 08:00:51 -04 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-bfab84d062ff84590b4510e89127de05f.dblock.zip.aes attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-20 08:00:53 -04 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-20200320T115935Z.dlist.zip.aes attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-20 08:01:02 -04 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-b683056732b2147349992e6c71f1676eb.dblock.zip.aes attempt 2 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-20 08:01:19 -04 - [Retry-Duplicati.Library.Main.BackendManager-RetryList]: Operation List with file  attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-21 08:00:51 -04 - [Retry-Duplicati.Library.Main.BackendManager-RetryDelete]: Operation Delete with file duplicati-ica2527e51776408ea8e515613ba06104.dindex.zip.aes attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-21 08:01:04 -04 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-b0db828d8bedb44c4a65c067442f25aec.dblock.zip.aes attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-21 08:01:23 -04 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-b297f2333d2cc4fe5836aa117934e6723.dblock.zip.aes attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
2020-03-21 08:01:44 -04 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-b2148ad53e84b47cfbd3158edebcbeda6.dblock.zip.aes attempt 1 of 10 failed with message: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.

Thank you @drwtsn32 that has worked.

Thank you @ts678; I have set up a log file with log level Retry and will leave it for a month or so to see if the warning recurs.

@ts678; I didn’t have to wait very long before the warning reappeared!
From looking at the log file, it would appear that my issue is pretty much the same as what you experienced with Google Drive and fits your explanation (what you explained helped me understand the log file - thank you!).
What is also interesting from the log file is that I am experiencing 403 Forbidden errors on a regular basis, which retry successfully; but because that backup doesn’t require any delete operations, no warnings are flagged.
Here is my log file: Desktop.zip (3.6 KB). The log covers other days as well; the backup warnings were flagged for today’s backup (2020-05-13).

I recall that Google Drive has limitations (like other free storage-sharing providers such as OneDrive) and I wonder if this is the cause of the 403 Forbidden errors. A quick Google search brings up this Stack Overflow question.

Thank you.

I note that other users are also receiving these warnings from Google Drive. So, I just want to provide my conclusion to this issue. As stated in my last reply:

I have previously had a bad experience using OneDrive with the Cloudbacko backup software (I seem to recall it being said on the former Cloudbacko forums that OneDrive imposed a throttle?), so I switched Cloudbacko to an SFTP server, which reduced the errors to only occasional ones.
Following the research quoted above regarding Google Drive, I feared that I was experiencing the same limitations, and because I continued to get these warnings, I decided to ditch Google Drive in favour of Microsoft Azure. I’m pleased to say that I haven’t had any problems with Duplicati since doing this, and I have now been running on Azure for about a year. Before switching to Azure, I upgraded my internet connection to a bigger upstream in case it was influencing the problem, but it didn’t help.
Furthermore, I have also switched Cloudbacko over to Azure from the SFTP server I was previously using, and I have had no further errors with Cloudbacko either - it runs very smoothly too.

I have since decided to set up a further two PCs with Duplicati; one also backs up to Microsoft Azure and the other backs up to the free Backblaze B2 storage (though the data on that PC is small and barely changes) - again, no issues.

So all I can say is that since switching to backend storage that is made for the task, everything runs smoothly and quickly with no errors. I have concluded that a lot of these problems come down to unreliable backend storage rather than to the software being used.
