403 error on backup

Hello, today I got an error during backup:
2019-10-05 14:19:18 +02 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: Y:****\2019-10-04\accounts\szymxhnh.tar.gz
That file does not exist any more.
How can I solve this error?
Thank you

Was this backup job running for a while? Maybe the file existed when the job first started (when Duplicati scanned your filesystem) but was deleted by the time Duplicati actually got to this folder to back up this file.

Do you think that’s maybe what happened?

If so, there are a couple of options: one is to utilize filesystem snapshots (VSS on Windows); the other is to simply disregard the error.
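If you go the snapshot route, the relevant advanced option is --snapshot-policy (off / auto / on / required); with auto, Duplicati attempts a snapshot and continues without one if it fails. Note that VSS requires Duplicati to run with administrator rights. A rough command-line sketch, where the target URL and source path are placeholders:

    Duplicati.CommandLine.exe backup TARGET_URL "Y:\data" --snapshot-policy=auto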

Thank you, and sorry, my mistake: the problem is with Google Drive. This is the remote log:

delete duplicati-b21a10e6513ff4576808bbb4e40b52fae.dblock.zip.aes
System.Net.WebException: The remote server returned an error: (403) Forbidden.
at Duplicati.Library.Utility.AsyncHttpRequest.AsyncWrapper.GetResponseOrStream()
at Duplicati.Library.Utility.AsyncHttpRequest.GetResponse()
at Duplicati.Library.JSONWebHelper.GetResponse(AsyncHttpRequest req, Object requestdata)
at Duplicati.Library.JSONWebHelper.ReadJSONResponse[T](AsyncHttpRequest req, Object requestdata)
at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.Delete(String remotename)
at Duplicati.Library.Main.BackendManager.DoDelete(FileEntryItem item)

I am using googledrive-teamdrive-id for this backup. After I manually deleted duplicati-b21a10e6513ff4576808bbb4e40b52fae.dblock.zip.aes, it is working now. Is there any way to make Duplicati delete the file automatically?

Looks like Duplicati tried to delete the file but didn’t have permission. Maybe someone else has an idea; I’m unfamiliar with using Google Drive for the back-end.

Agreed. If this is a shared Team Drive, I don’t have one to test with, but I can try citing sources. From RFC 7231, 403 Forbidden:

   The 403 (Forbidden) status code indicates that the server understood
   the request but refuses to authorize it.  A server that wishes to
   make public why the request has been forbidden can describe that
   reason in the response payload (if any).

Are you doing this through the web interface with the same login as Duplicati, or are the logins different?

Is Duplicati using your personal drive or a shared drive? If it’s shared, what is Duplicati’s access level to the drive?

Shared drives access levels

If Duplicati has Contributor access, it would be able to upload, but deleting needs Content manager or Manager access, otherwise a 403 seems the likely result. There are other ways to get a 403 (e.g. by excessive uploading); however, for a delete, a permission error seems more likely. How many backups ran before the 403 appeared?
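If you want to check what your authorized identity may actually do, the Drive v3 file resource reports per-file capabilities. A minimal Python sketch, assuming google-api-python-client and an already-authorized token; the token file name and file ID are placeholders:

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    # Hypothetical, already-authorized user token (placeholder file name).
    creds = Credentials.from_authorized_user_file('token.json')
    service = build('drive', 'v3', credentials=creds)

    # supportsAllDrives=True is needed for items that live on a shared drive.
    info = service.files().get(
        fileId='FILE_ID_PLACEHOLDER',  # e.g. the ID of one dblock file
        fields='name, capabilities(canDelete)',
        supportsAllDrives=True,
    ).execute()

    # capabilities.canDelete being False would line up with a 403 on delete.
    print(info['name'], info['capabilities']['canDelete'])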

Resolve errors lists a lot of ways to get a 403. Possibly you could find some details in a message below the error itself (which is very generic). If About → Show log → Remote doesn’t show it, you might need to watch live logs at About → Show log → Live to catch it, or set up a --log-file in order to see the later lines.

Logging would also show why the delete is being attempted. If you have run backups for a while, it might have become time for a compact; however, an upload retry (after a network error) will also delete the first attempt.

You could view your job log (Show log under the job) at the failure time, e.g. is CompactResults meaningful?

Retries of uploads are best seen in a log at Retry level (for a log file, the setting is --log-file-log-level), but they can also be inferred from RetryAttempts in your job log, or (I think) from the job’s Show log → Remote showing repeated put attempts of a file with the same size and hash but a changing name (it gets renamed).
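For reference, both of those are ordinary advanced options, so a run with retry-level logging looks something like this (target URL and paths are placeholders):

    Duplicati.CommandLine.exe backup TARGET_URL "Y:\data" --log-file=duplicati.log --log-file-log-level=Retry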

Same problem here on Ubuntu 18.04.3 LTS.

I found a fast way to reproduce the problem: if Duplicati created a remote backup in the past, and detects it when you re-use the same config, it cannot delete it.

I tried multiple AuthIDs, with both limited and unlimited access, to no avail. Using rclone delete on the drive with one of the unlimited AuthIDs deleted the file flawlessly.
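The rclone call was along these lines, using its single-file delete command (the remote, folder, and file names here are placeholders, assuming a gdrive remote set up with rclone’s team_drive option):

    rclone deletefile gdrive:backup-folder/duplicati-bXXXX.dblock.zip.aes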

{"ClassName":"System.Net.WebException","Message":"The remote server returned an error: (403) Forbidden.","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":" at System.Net.HttpWebRequest.GetResponseFromData (System.Net.WebResponseStream stream, System.Threading.CancellationToken cancellationToken) [0x00146] in <2703bbaa0a6e4686b6033c2dddb1a363>:0 \n at System.Net.HttpWebRequest.RunWithTimeoutWorker[T] (System.Threading.Tasks.Task`1[TResult] workerTask, System.Int32 timeout, System.Action abort, System.Func`1[TResult] aborted, System.Threading.CancellationTokenSource cts) [0x000f8] in <2703bbaa0a6e4686b6033c2dddb1a363>:0 \n at Duplicati.Library.Main.BackendManager.Delete (System.String remotename, System.Int64 size, System.Boolean synchronous) [0x0000a] in <759bd83d98134a149cdf84e129a07d38>:0 \n at Duplicati.Library.Main.Controller.b__24_0 (Duplicati.Library.Main.ListRemoteResults result) [0x0010f] in <759bd83d98134a149cdf84e129a07d38>:0 ","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233079,"Source":"Duplicati.Library.Main"}

I’d be glad to forward any other log/info if needed!

Welcome to the forum @draimundo

Is this a Google Shared Drive situation too? If so, the same questions may apply.

Team Drive for the Confused - Google Drive Vs Team Drive (Shared Drives) may also help, and if any readers are expert in such matters, Duplicati may need your help to overcome the announced shutdown:

Enhancing security controls for Google Drive third-party apps means that, if they follow through, you lose full-Drive access and Duplicati can only access files that it originally created. That creates problems if files were moved into place some other way for Duplicati to use, which is actually one of the things I wonder may be possible here.

Did Duplicati create the backup exactly where it sits now, or was it moved? How far in the past was it created? Can the problem be recreated today?

Definitely a Team Drive. I haven’t got Content manager access to the Team Drive, but I definitely have Manager.

Yes, I really didn’t intervene (and I tried multiple times to run backups; it could never remove old files…). From what I can tell, the files weren’t touched at all. I didn’t recreate it today (I went for a local backup and then use rclone to sync it with remotes for now), but I’m sure I could recreate it by backing up a single file.

Actually, I investigated a bit and tried the Google Drive delete API call (used by Duplicati) on a specific file uploaded by Duplicati, and got the same error Duplicati is getting.

From this I conclude that nothing’s wrong with Duplicati itself. I’m just wondering how rclone is doing it (maybe a different command? I didn’t look further yet), and whether we could implement it in Duplicati. This may also be the solution for the security restrictions in the year to come. One guess to compare is sketched below.
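One guess worth testing (an assumption on my part, not something confirmed here): Drive v3 wants supportsAllDrives=true on requests that touch shared-drive items, and I believe rclone sets it when a team_drive is configured, so a delete issued without it can fail even when the account’s access level would allow deletion. A minimal Python sketch of the variant to try, with placeholder token file and file ID:

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    # Hypothetical, already-authorized user token (placeholder file name).
    creds = Credentials.from_authorized_user_file('token.json')
    service = build('drive', 'v3', credentials=creds)

    # Without supportsAllDrives=True, Drive v3 treats shared-drive items as
    # out of scope and the request can fail; with it, the delete should go
    # through if the account's access level permits it.
    service.files().delete(
        fileId='FILE_ID_PLACEHOLDER',  # ID of the stuck dblock file
        supportsAllDrives=True,
    ).execute()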