System.Exception: 401 - access_denied

I"m getting the following failure every time I run Duplicati (2.1.0.5_stable_2025-03-04)

  • It runs for quite a while, but errors out before actually backing up any files.
  • I’m on an Intel Mac running Sequoia 15.5.
  • I’m backing up to Backblaze (and the connection is verified).
  • I set Backup Retention = Keep all backups.

I’ve looked through a bunch of forum posts, but haven’t found anything that seems related.

Any help would be much appreciated.

System.Exception: 401 - access_denied: Access Denied
at Duplicati.Library.Main.BackendManager.Delete(String remotename, Int64 size, Boolean synchronous)
at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles, Boolean logErrors)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(String backendurl, Options options, BackupResults result)
at Duplicati.Library.Main.Operation.BackupHandler.RunAsync(String sources, IFilter filter, CancellationToken token)
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String sources, IFilter filter, CancellationToken token)
at Duplicati.Library.Main.Controller.<>c__DisplayClass17_0.b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

Welcome to the forum @sgerardkr

Is this an initial backup, an old backup that suddenly had trouble, or what?

I’ll assume that’s the storage type you chose. Backblaze also supports S3 these days.

If you mean the Test connection button, it’s a very basic check, usually just a login and a file list.

RemoteListAnalysis would also be looking at the file list, and reacting to what’s seen.

Backblaze seems like it doesn’t like the file deletion. It can be set up to block deletes, but the default for an application key would not do that. Did you configure the key in a special way?

If it’s been a while and you’ve forgotten, you could set up another key to see whether it can do a delete.

You can also test put and delete using BackendTool and a filename chosen so it doesn’t look like a Duplicati file (those typically begin with duplicati-). For a full test, the BackendTester can test an empty folder. Start with Export As Command-line to get a URL.
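If it helps, here is a rough sketch of that test wrapped in a bit of Python. The tool location, the PUT/LIST/DELETE argument order, and the URL are assumptions and placeholders on my part (check the usage text of the BackendTool in your install, and take the real URL from Export As Command-line):

```python
#!/usr/bin/env python3
# Rough sketch only: exercise PUT, LIST and DELETE against the backend with
# Duplicati's BackendTool. The tool path, verb/argument order, and URL below
# are assumptions/placeholders -- adjust them to your own install.
import subprocess

TOOL = "/Applications/Duplicati.app/Contents/MacOS/duplicati-backend-tool"  # assumed location
URL = "b2://bucket/folder?auth-username=KEYID&auth-password=APPKEY"         # placeholder

def run(*args):
    """Run one BackendTool command and report whether it succeeded."""
    result = subprocess.run([TOOL, *args])
    print(" ".join(args), "->", "OK" if result.returncode == 0 else f"failed ({result.returncode})")

# A name that does not look like a Duplicati file (those begin with duplicati-)
testname = "delete-permission-test.txt"
with open(testname, "w") as f:
    f.write("delete permission test\n")

run("PUT", URL, testname)      # upload the test file
run("LIST", URL)               # it should appear in the listing
run("DELETE", URL, testname)   # a delete-restricted key should refuse this one
```

If PUT and LIST succeed but DELETE comes back with the same 401 - access_denied, the key (or the bucket’s settings) is the culprit.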

For a better view of what leads up to the failed Delete, watch About → Show log → Live → Information. There’s presumably a reason for a delete, but why B2 refused is the mystery.

Specifically this part says that Duplicati does not have permissions to delete files on B2.

If a transfer fails, Duplicati will make a new filename (never overwrite) and repeat the upload.
In some cases the failure will leave a (partial) file on the remote storage that needs to be deleted.
Could it be the case here that Duplicati wants to delete that specific file, but you do not grant it permission to do so?

Thanks @kenkendk. As part of my protection against malware somehow deleting my backups, I configured Backblaze to never allow deletes by anyone for 1 year. So, yes, Backblaze is likely denying the delete.

But I configured Duplicati to never try to delete anything

I thought that would prevent the problem, but it doesn’t seem to.

Not really. You configured it to not delete backup versions. Read what was said before:

If an individual file needs to be deleted, it is marked in the database and (I think) deleted exactly as per original post, as part of cleanup before the backup itself is allowed to start.

Have we heard the file name yet? You can probably get it from a live log or a disk log-file. The best case would probably be that it’s a dblock file, because dlist and dindex files are ones the backup actually wants to keep.

EDIT 1:

If you like, you can get a tool like DB Browser for SQLite to look at your job Database (safest to use a copy). If the Remotevolume table has a row with Deleting as its State, that’s probably the file Duplicati is trying to delete. You can make Duplicati stop attempting the cleanup; however, depending on what type of file it is and how bad it is, that may be a future problem.
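For anyone who prefers a script over DB Browser, a minimal sketch along these lines shows the same thing. It assumes the Remotevolume table also has Name and Type columns (if the query errors out, check the schema in DB Browser), and DB_COPY must point at a copy of the job database, never the live one:

```python
#!/usr/bin/env python3
# Minimal sketch: list remote volumes that a COPY of the job database has
# marked as Deleting. Assumes Remotevolume has Name/Type/State columns.
import sqlite3

DB_COPY = "/path/to/copy-of-job-database.sqlite"  # placeholder: a copy, not the live DB

con = sqlite3.connect(DB_COPY)
rows = con.execute(
    "SELECT Name, Type, State FROM Remotevolume WHERE State = 'Deleting'"
).fetchall()
con.close()

for name, vtype, state in rows:
    # A dblock here is usually leftover from a failed upload or a compact;
    # a dlist or dindex in this state is more likely to matter later.
    print(state, vtype, name)
```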

I’m running “Verify Files” now. I’ll post again when that’s complete.

Although individual file retries (and deletion of the old attempts) can happen, another source of deletes is compact, which can try to coalesce and delete files that are too small (the threshold is configurable):

--small-file-size=<int>
  Files smaller than this size are considered to be small and will be compacted with other small files as soon as there are <small-file-max-count> of them. --small-file-size=20 means 20% of <dblock-size>.

This can be avoided by setting the no-auto-compact option (which doesn’t stop efforts to clean up bad files). It’s sort of possible to guess which case you hit, especially if you look at the database to see what it wants to delete. A single file would suggest a single upload error. Timing is another clue: compact would run after the backup, but if a deletion fails, the retry happens before the next backup, so you’d have to go back through the logs to find the first delete refusal.
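To make the small-file case concrete, here is a simplified illustration of the trigger described above. This is not Duplicati’s actual compact code, and the defaults shown for dblock-size and small-file-max-count are assumptions, so check your own option values:

```python
#!/usr/bin/env python3
# Simplified illustration only (not Duplicati's actual compact logic) of the
# small-file trigger quoted above. The defaults below are assumptions --
# check your own dblock-size and small-file-max-count settings.

dblock_size = 50 * 1024 * 1024   # assumed remote volume size (50MB)
small_file_size = 20             # quoted above: 20 means 20% of dblock-size
small_file_max_count = 20        # assumed threshold for "enough small files"

small_limit = dblock_size * small_file_size // 100   # ~10MB with these numbers

def wants_compact(remote_dblock_sizes):
    """True once enough small dblocks pile up; a compact would then upload a
    merged dblock and DELETE the small originals -- the step B2 refused here."""
    small = [s for s in remote_dblock_sizes if s < small_limit]
    return len(small) >= small_file_max_count

print(wants_compact([2 * 1024 * 1024] * 25))   # many tiny dblocks -> True
print(wants_compact([48 * 1024 * 1024] * 25))  # mostly full-size volumes -> False
```

With no-auto-compact set, this trigger simply never fires; the cleanup of bad or partial files is a separate path.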

I added no-auto-compact and it seems to be running. It’s certainly running longer than previously. Fingers crossed. THANKS!

Thanks @kenkendk and @ts678 for your comments. I added no-auto-compact and everything seems to be running again. THANK YOU!
