“Fatal IOException: Permission denied” error the first time a backup job is launched

Hello everyone,

I have created a backup job for my 3TB media library. Data is backed up to an external hard drive connected directly to my NAS. During execution the job fails after one minute with these errors in the logs.

“2025-02-25 11:03:26 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error IOException: Permission denied”

“2025-02-25 11:03:26 +01 - [Error-Duplicati.Library.Main.Controller-FailedOperation]: The operation Backup has failed with error: One or more errors occurred. (Permission denied (Permission denied) (One or more errors occurred. (Permission denied)))
AggregateException: One or more errors occurred. (Permission denied (Permission denied) (One or more errors occurred. (Permission denied)))”

Here are the full logs:

{
  "DeletedFiles": 0,
  "DeletedFolders": 0,
  "ModifiedFiles": 0,
  "ExaminedFiles": 2,
  "OpenedFiles": 0,
  "AddedFiles": 0,
  "SizeOfModifiedFiles": 0,
  "SizeOfAddedFiles": 0,
  "SizeOfExaminedFiles": 1429440013,
  "SizeOfOpenedFiles": 0,
  "NotProcessedFiles": 0,
  "AddedFolders": 2,
  "TooLargeFiles": 0,
  "FilesWithError": 0,
  "ModifiedFolders": 0,
  "ModifiedSymlinks": 0,
  "AddedSymlinks": 0,
  "DeletedSymlinks": 0,
  "PartialBackup": false,
  "Dryrun": false,
  "MainOperation": "Backup",
  "CompactResults": null,
  "VacuumResults": null,
  "DeleteResults": null,
  "RepairResults": null,
  "TestResults": null,
  "ParsedResult": "Fatal",
  "Interrupted": false,
  "Version": "2.1.0.4 (2.1.0.4_stable_2025-01-31)",
  "EndTime": "2025-02-25T10:03:26.9954316Z",
  "BeginTime": "2025-02-25T10:02:26.4467173Z",
  "Duration": "00:01:00.5487143",
  "MessagesActualLength": 96,
  "WarningsActualLength": 0,
  "ErrorsActualLength": 2,
  "Messages": [
    "2025-02-25 11:02:26 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started",
    "2025-02-25 11:02:32 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2025-02-25 11:02:32 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  ()",
    "2025-02-25 11:02:35 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-baae1bda4da1d46919c99c7289c05fae4.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:35 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-baae1bda4da1d46919c99c7289c05fae4.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:35 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b02a4c21bc60f46739499ec9d184e0cf4.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:35 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-b02a4c21bc60f46739499ec9d184e0cf4.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:36 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-beb6b384582a743a79eba0ff534fe22df.dblock.zip.aes (49.008 MB)",
    "2025-02-25 11:02:36 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-beb6b384582a743a79eba0ff534fe22df.dblock.zip.aes (49.008 MB)",
    "2025-02-25 11:02:36 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bbd62e62b54c946ddb37c0b0f7dd6d6d8.dblock.zip.aes (49.008 MB)",
    "2025-02-25 11:02:36 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-bbd62e62b54c946ddb37c0b0f7dd6d6d8.dblock.zip.aes (49.008 MB)",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-baae1bda4da1d46919c99c7289c05fae4.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b2cc9a3e0498c49aa9f5bf5597cce1cab.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.Operation.Backup.BackendUploader-RenameRemoteTargetFile]: Renaming \"duplicati-baae1bda4da1d46919c99c7289c05fae4.dblock.zip.aes\" to \"duplicati-b2cc9a3e0498c49aa9f5bf5597cce1cab.dblock.zip.aes\"",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b02a4c21bc60f46739499ec9d184e0cf4.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-bcbb1a79548ec49938834360e50bc5646.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.Operation.Backup.BackendUploader-RenameRemoteTargetFile]: Renaming \"duplicati-b02a4c21bc60f46739499ec9d184e0cf4.dblock.zip.aes\" to \"duplicati-bcbb1a79548ec49938834360e50bc5646.dblock.zip.aes\"",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b2cc9a3e0498c49aa9f5bf5597cce1cab.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bcbb1a79548ec49938834360e50bc5646.dblock.zip.aes (49.009 MB)",
    "2025-02-25 11:02:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-b2cc9a3e0498c49aa9f5bf5597cce1cab.dblock.zip.aes (49.009 MB)"
  ],
  "Warnings": [],
  "Errors": [
    "2025-02-25 11:03:26 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error\nIOException: Permission denied",
    "2025-02-25 11:03:26 +01 - [Error-Duplicati.Library.Main.Controller-FailedOperation]: The operation Backup has failed with error: One or more errors occurred. (Permission denied (Permission denied) (One or more errors occurred. (Permission denied)))\nAggregateException: One or more errors occurred. (Permission denied (Permission denied) (One or more errors occurred. (Permission denied)))"
  ],
  "BackendStatistics": {
    "RemoteCalls": 22,
    "BytesUploaded": 0,
    "BytesDownloaded": 0,
    "FilesUploaded": 0,
    "FilesDownloaded": 0,
    "FilesDeleted": 0,
    "FoldersCreated": 0,
    "RetryAttempts": 20,
    "UnknownFileSize": 0,
    "UnknownFileCount": 0,
    "KnownFileCount": 0,
    "KnownFileSize": 0,
    "LastBackupDate": "0001-01-01T00:00:00",
    "BackupListCount": 0,
    "TotalQuotaSpace": 4000768327680,
    "FreeQuotaSpace": 3731735183360,
    "AssignedQuotaSpace": -1,
    "ReportedQuotaError": false,
    "ReportedQuotaWarning": false,
    "MainOperation": "Backup",
    "ParsedResult": "Success",
    "Interrupted": false,
    "Version": "2.1.0.4 (2.1.0.4_stable_2025-01-31)",
    "EndTime": "0001-01-01T00:00:00",
    "BeginTime": "2025-02-25T10:02:26.4467198Z",
    "Duration": "00:00:00",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null
  }
}

I tried increasing the block size to 1 GB, and the transfer starts fine, which ruled out an access problem for me.

I also stopped my Plex container, just in case a parallel process was interfering, but without success.

I need your help please :sweat_smile:

Welcome to the forum @51473w153

Enlarging from what previous value, and which option did you enlarge?

A block size of 1 GB seems far too high and would probably be rejected.
The blocksize can’t exceed the dblock-size (“Remote volume size”).
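To spell out the difference (my own illustration, not your job’s actual settings): --blocksize is the size of the chunks Duplicati hashes and deduplicates, while --dblock-size (“Remote volume size” in the UI) is the size of the .dblock.zip.aes volumes written to the destination. Roughly, on the command line (assuming the duplicati-cli entry point; the paths and values here are examples only):

    # example only: 1 MB blocks packed into 50 MB remote volumes
    duplicati-cli backup file:///mnt/external/Medias /data/medias \
      --blocksize=1MB \
      --dblock-size=50MB

Blocks get packed into dblock volumes, so blocksize has to stay well below dblock-size.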

Thanks for your answer.

I didn’t know the difference between the two (block and dblock). I’m talking about the dblock size, which I increased from 50 MB to 1 GB to try it out. I have since deleted the job and recreated it with the default size. Same problem.

Also my other jobs have the same destination and all work perfectly. Only the source differs (the media are on a separate hard disk from all my other data).

If it’s useful, I’m running Duplicati on the Linuxserver Docker image.

I’m adding the server logs here:

Feb 25, 2025 11:03 AM: Failed while executing Backup "Backup_Medias" (id: 9)
System.AggregateException: One or more errors occurred. (Permission denied (Permission denied) (One or more errors occurred. (Permission denied)))
 ---> System.AggregateException: Permission denied (Permission denied) (One or more errors occurred. (Permission denied))
 ---> System.IO.IOException: Permission denied
   at Duplicati.Library.Main.Operation.Backup.BackendUploader.<Run>b__13_0(<>f__AnonymousType2`1 self)
   at Duplicati.Library.Main.Operation.Backup.BackendUploader.<Run>b__13_0(<>f__AnonymousType2`1 self)
   at CoCoL.AutomationExtensions.RunTask[T](T channels, Func`2 method, Boolean catchRetiredExceptions)
   at Duplicati.Library.Main.Operation.BackupHandler.FlushBackend(BackupResults result, IWriteChannel`1 uploadtarget, Task uploader)
   at Duplicati.Library.Main.Operation.BackupHandler.RunAsync(String[] sources, IFilter filter, CancellationToken token)
   --- End of inner exception stack trace ---
   at Duplicati.Library.Main.Operation.BackupHandler.RunAsync(String[] sources, IFilter filter, CancellationToken token)
 ---> (Inner Exception #1) System.AggregateException: One or more errors occurred. (Permission denied)
 ---> System.IO.IOException: Permission denied
   at Duplicati.Library.Main.Operation.Backup.BackendUploader.<Run>b__13_0(<>f__AnonymousType2`1 self)
   at Duplicati.Library.Main.Operation.Backup.BackendUploader.<Run>b__13_0(<>f__AnonymousType2`1 self)
   at CoCoL.AutomationExtensions.RunTask[T](T channels, Func`2 method, Boolean catchRetiredExceptions)
   at Duplicati.Library.Main.Operation.BackupHandler.FlushBackend(BackupResults result, IWriteChannel`1 uploadtarget, Task uploader)
   at Duplicati.Library.Main.Operation.BackupHandler.RunAsync(String[] sources, IFilter filter, CancellationToken token)
   --- End of inner exception stack trace ---<---

   --- End of inner exception stack trace ---
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter, CancellationToken token)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass17_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

Now that we know that’s the dblock-size, does “starts fine” mean it wrote part of a 1 GB file?

Have you been changing the user it runs as? This is often necessary to get access to the source.

About → System info → UserName will show it. Is the hard drive folder set up with any permissions?
Some filesystem formats don’t even allow them. Others might be particular about user access…
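If you want to check from a shell, something like this shows who owns the destination folder and what it permits (paths are just examples, and GNU stat is assumed):

    # run on the NAS where the external disk is mounted; paths are illustrative
    ls -ld /mnt/external/Medias
    stat -c '%U:%G %a %n' /mnt/external/Medias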

I just tried again with a size of 1 GB.

The graphical interface shows that movies are being backed up, with several scrolling past, and the total number of files left to back up keeps decreasing.

However, once the error occurs and I look in the destination, there’s nothing there…

I imagine it would be the same with a different dblock size.

I hadn’t thought to look in the destination directly; until now I always deleted the job with the box checked to also delete the remote data.

I’m using the same user to access the data and to launch the Docker container (PUID and PGID in the YAML), however that user doesn’t appear in the place you pointed me to.

I use an ext4 file system for my internal disks, and exFAT for the external disk.

For the source, the user has the same rights on all the folders and sub-folders I’m trying to back up. I’ve just checked again.

And for the destination, no authentication is required.

I may have a clue.

I manage my rights with ACLs, and from what I’ve read, exFAT doesn’t support them.

Except that I’ve just checked: the other folders in the destination have the same rights as the Medias folder, and I’ve had no problems with those. I just don’t get it.

Even running chmod -R 777 on the destination folder doesn’t work.

It doesn’t seem to be an access right problem.

Neither of those means anything in terms of writing to the destination. It’s preparation for that.

Probably because it finally accumulated enough to write a remote volume at the larger size.

What you highlighted in red looks like About → System info → UserName. If not, what is it?

What do other containers (which seem to have write access) show? The same abc user?

I don’t think exFAT does permissions. I don’t do Docker (and you say you set it all up the same), however it might be worth going into the container with docker exec or something to verify it provides access to the external disk and that you can write a file to it (on the disk itself, not just somewhere inside the container).

The documentation I see says exFAT (like FAT) doesn’t have permissions that you can chmod.
Please do some research on it. I see some people saying that how it’s mounted affects access.
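Roughly like this, with the caveat that the container name and paths below are made-up examples, not your actual setup:

    # hypothetical container name and destination path, adjust to your compose file
    docker exec -it duplicati /bin/bash
    touch /destination/Medias/testfile && ls -l /destination/Medias/testfile
    rm /destination/Medias/testfile

And since exFAT stores no per-file owners, whatever access exists gets fixed when the volume is mounted, e.g. something along the lines of:

    # example only: uid/gid/umask mount options decide who can write the whole exFAT volume
    mount -t exfat -o uid=1000,gid=1000,umask=022 /dev/sdb1 /mnt/external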


This is the place you pointed me to (About → System info → UserName), but abc is not my user.
My user doesn’t appear there.

The other jobs work perfectly, and the Duplicati files are written to their destination folders.

They run as the same user, which should appear instead of abc.
I didn’t need to enter any credentials.

I can create a file in the destination directory on the external disk, from inside the container.

I’ve tried this on several folders, and they all work.

So it’s in the right place, but you are surprised by the value it showed.

Don’t services in Docker containers typically run not as you, but as whatever user the container sets?

Duplicati didn’t design this, but LinuxServer’s page about that is below:

Understanding PUID and PGID

Their default user isn’t root, so it may need changing in order to do the backup.
I don’t use LinuxServer, don’t use Docker, and don’t maintain theirs, but the page above
is probably where the abc user comes from.
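For reference, their images take the host user as environment variables, so the relevant part of the compose file looks roughly like this (a sketch only; the image tag is LinuxServer’s, but the paths and IDs are examples, not your file):

    # hypothetical compose snippet for the LinuxServer Duplicati image
    services:
      duplicati:
        image: lscr.io/linuxserver/duplicati:latest
        environment:
          - PUID=1000        # UID the abc user is remapped to
          - PGID=1000        # GID the abc user is remapped to
        volumes:
          - /path/to/config:/config
          - /data/medias:/source/medias:ro
          - /mnt/external:/destination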

While there, you can run the id command, maybe peek at /etc/passwd, and ps the Duplicati process to confirm the actual UID you and Duplicati are using. If they’re the same, this is puzzling me…
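Something like the following, run from a shell inside the container (the ps syntax may differ if the image ships busybox):

    id                                # the uid/gid your shell has inside the container
    grep abc /etc/passwd              # what uid/gid the abc user is mapped to
    ps aux | grep -i '[d]uplicati'    # the user the server process actually runs as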


The abc user is correctly mapped to mine: same PUID & PGID.

The ./Duplicati-server process shows the same VSZ and RSS whether I look from my NAS or from inside the Docker container. Everything seems to be linked correctly.

I don’t understand what it doesn’t like.

Anyway, thanks for your time.

I think I’ve solved the problem:

I had created the folder tree on the external hard drive from my Windows PC.
I think a “rights conflict” between Windows and Linux must have occurred on this particular folder.

I just deleted the folder from Windows and recreated it from my NAS with the correct user. The backup job has been running for 10 minutes, with no errors.

The mystery is: why this folder and not the others?
