Permission denied on SFTP after unrelated config change

After changing the remote-volume-size for some of our backup jobs, some of them now show the following error when trying to back up to our SFTP storage.

I made this change on all 7 of our Windows servers, which all run the same job; the next day, 3 of them had this error. Access to the SFTP storage is otherwise working fine, and no other changes have been made.

  1. Febr. 2022 13:42: delete duplicati-b6366a9e5b3444cbc9996b4bb636f3c49.dblock.zip.aes.part
    Renci.SshNet.Common.SftpPermissionDeniedException: Permission denied
    at Renci.SshNet.Sftp.SftpSession.RequestRemove(String path)
    at Duplicati.Library.Backend.SSHv2.Delete(String remotename)
    at Duplicati.Library.Main.BackendManager.DoDelete(FileEntryItem item)

It looks like it’s failing when it tries to delete remote files. Did you check permissions on your Windows Server / SFTP service configuration? Maybe deletions are being prohibited.

I am going to speculate that changing the remote volume size may have triggered Duplicati’s “compaction” operation, where it tries to minimize the number of files and the wasted space on the back end. During compaction files may be deleted, so deletion has to be allowed on the remote.
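
Roughly speaking, and only as an illustration (this is not Duplicati’s actual code, and the 25% wasted-space threshold and small-volume limit below are assumed defaults, not a spec), the compaction decision looks something like this:

    def should_compact(wasted_bytes, total_bytes, small_volumes,
                       threshold=0.25, small_volume_limit=20):
        """Simplified sketch of when a compaction of the back end might kick in."""
        too_much_waste = total_bytes > 0 and wasted_bytes / total_bytes > threshold
        too_many_small = small_volumes > small_volume_limit
        return too_much_waste or too_many_small

    # Example: 30% of the stored data is no longer referenced, so compaction runs;
    # it downloads affected volumes, repacks them, uploads new ones, and *deletes*
    # the old volumes on the remote -- which is why delete permission matters.
    print(should_compact(wasted_bytes=300, total_bytes=1000, small_volumes=5))  # True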

The SFTP permissions didn't change, and the same account works for identical jobs on other servers. It must be related to the volume-size change, but I can't find the cause.

Can you double-check the delete permission? Use a regular SFTP client, authenticate with the same credentials Duplicati uses, then try uploading a file and deleting it again.
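
If you want to script that check instead of doing it by hand, here is a minimal sketch using Python’s paramiko package; the host, port, credentials, and folder are placeholders to replace with what the Duplicati job uses:

    import paramiko

    # Placeholders: use the same host, port, account, and target subfolder as the job.
    HOST, PORT = "sftp.example.com", 22
    USER, PASSWORD = "backupuser", "secret"
    REMOTE_DIR = "/folder1"

    transport = paramiko.Transport((HOST, PORT))
    transport.connect(username=USER, password=PASSWORD)
    sftp = paramiko.SFTPClient.from_transport(transport)

    test_path = REMOTE_DIR + "/duplicati-permission-test.tmp"
    with sftp.open(test_path, "w") as f:    # upload a small test file
        f.write("delete me")
    sftp.remove(test_path)                  # the delete is what fails for Duplicati
    print("upload + delete OK")

    sftp.close()
    transport.close()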

It's definitely a Duplicati issue and not SFTP permissions, and I found out some more things.

What's affected are folders inside the job's directory on the SFTP server. After deleting those folders manually and rerunning and repairing the job, it looks like I can fix it that way, but the jobs haven't finished yet.

We have one SFTP server with a subfolder for each server where Duplicati is running, so each job has its own directory. Is it normal for jobs to access or work with the other directories, even outside their own subfolder?

The structure is like this:
Job1 = sftp/folder1
Job2 = sftp/folder2

But when I run job 1, there are also folder changes happening in folder2. Is it supposed to work like that, or might this be an error?

Absolutely not. You should have it configured as you do - each job on each PC writes to a unique subfolder. I do it this way myself. If a job is touching files in the “wrong” folder, then something is wrong with the backup job configuration.

Would you agree that folders named similar to “duplicati-b6366a9e5b3444cbc9996b4bb636f3c49.dblock.zip.aes.part” shouldn't change (date, etc.) unless that specific job is running?

I just checked the backup jobs, and the target directories are definitely set to the specific subfolders. They don't share any folders at all.

Duplicati never “changes” files on the back end at all. After upload the file will stay there and Duplicati will never modify its contents. The only thing Duplicati may do at some point is delete the file. Deletions can happen during compaction or pruning operations.

You actually have a folder with that name? Duplicati rarely creates folders, but it makes a lot of files. The name cited would be a typical name for a dblock file, except for the extra .part suffix on the end. Do you have many such names? As I understand it, some servers use a .part name as a temporary filename during upload and rename it to remove the suffix once the upload is done. Does yours?
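
If it helps, here is a small sketch (again Python with paramiko, and placeholder connection details) that lists any leftover .part names in a job's folder and shows whether each one is really a file or a directory:

    import stat
    import paramiko

    HOST, PORT = "sftp.example.com", 22          # placeholders
    USER, PASSWORD = "backupuser", "secret"
    REMOTE_DIR = "/folder1"

    transport = paramiko.Transport((HOST, PORT))
    transport.connect(username=USER, password=PASSWORD)
    sftp = paramiko.SFTPClient.from_transport(transport)

    # Flag every entry that still carries the temporary-upload style ".part" suffix.
    for entry in sftp.listdir_attr(REMOTE_DIR):
        if entry.filename.endswith(".part"):
            kind = "dir" if stat.S_ISDIR(entry.st_mode) else "file"
            print(kind, entry.filename, entry.st_size)

    sftp.close()
    transport.close()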

What sort of SFTP server is this, and does it allow any logging to give its view of what’s going on?
What you should get in a backup is a flow of dblock and associated dindex files, then a dlist at the end.
You can get Duplicati’s view with a log, e.g. About → Show log → Live → Information; however, the log shows the simple filename without any full path. You can at least look into the .part issue there.

Are job 1 and job 2 running concurrently? That would explain why folder 2 is also getting file uploads.

The folder looks like this, and I realised the “Permission denied” error is Duplicati trying to remove those folders. If I delete them manually (with the same account, by the way, so the permissions are there), the job works again, at least for some time.

[screenshot of the remote folder]

Inside one of those folders: [screenshot]

A bit more information.

This is a newly set-up job running for the first time; the folder was empty beforehand.

The job creates these folders:
[screenshot of the folders created by the job]

But it doesn't run successfully; at the end the following error appears:
[screenshot of the error message]

The 4 remote files are exactly those 4 folders; if I run repair, it fails with “Permission denied” for them. After deleting them manually and rerunning the job, it seems to work, but after some time the same cycle starts again.

Is this possibly some kind of timing issue, where Duplicati is trying to delete the file while it's still in use?

The initial change was setting the remote volume size from 50 MB to 1 GB, because there were so many files. Do bigger files have some kind of effect here?

What would also help are answers to the previous questions, such as some information about what the server is (it looks maybe Linux-based, as I see forward slashes, and the picture shown doesn't look like Windows Explorer), information from Duplicati's logs, and (somewhat of a stretch goal) whether the server has any logs of its own.

Are there any creation times in whatever is doing the display (or otherwise)? There is likely some order where dblock and associated dindex files are uploaded, with dlist at the end (as was explained before).

You have four things whose icons somewhat suggest a folder, with contents inside totaling about the expected 1 GB (which is kind of large; it has drawbacks on restore because you might have to download a lot). The problem is that I don't think Duplicati ever uses all-numeric sequential names like the ones you found.
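
To put the size change in perspective, the volume size mostly determines how many dblock files end up on the remote; the 500 GB source size below is made up purely for illustration:

    # Illustration only: assume a 500 GB source; the real count depends on your data.
    source_gb = 500

    for volume_mb in (50, 1024):
        volumes = source_gb * 1024 // volume_mb
        print(f"{volume_mb} MB volumes -> roughly {volumes} dblock files (plus a dindex each)")

    # 50 MB volumes   -> roughly 10240 dblock files (plus a dindex each)
    # 1024 MB volumes -> roughly 500 dblock files (plus a dindex each)

Fewer, larger volumes mean less clutter on the remote, but more data to download per volume when restoring or compacting.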

The folder-looking things appear to have names matching four of five of the .aes files Duplicati sent, except for their extra .part suffix (mentioned before as something some servers do while receiving an upload).

If you check the Duplicati or server logs, you could maybe see whether Duplicati sent them, but it doesn't use such names, and I'm not seeing that the SFTP client library does either, or that it uploads into a folder. That said, my read of the SFTP client library's behavior is based more on searching its source and issue tracker (has someone asked about this?) than on actual tests.

If whatever this server is can show times, you should probably see something growing during the upload Duplicati does (as seen in its log). The unknown is whether the file grows in place up to 1 GB, or whether the growth happens in a .part folder of that name (e.g. the numbered file), with the final file only appearing at the end.

Duplicati definitely doesn't like any leftover items in its namespace (the duplicati- prefix) in this case, and my guess is that it lacks the ability to delete folders (it can create them, but generally only does so at the initial backup).

If I manually create an empty folder on Windows with the name of a dblock file plus an added .part suffix, and then try to run a backup, it pops up a red error: “Found 1 remote files that are not recorded in local storage, please run repair”. If I run Repair and view the live log, it retries up to:

Feb 25, 2022 9:42 AM: Operation Delete with file duplicati-bc6e6df4e2dc846f5aa53f591a42ce656.dblock.zip.part attempt 5 of 5 failed with message: Access to the path ‘\?\C:\ProgramData\Duplicati\duplicati-2.0.6.100_canary_2021-08-11\RUN\test A\duplicati-bc6e6df4e2dc846f5aa53f591a42ce656.dblock.zip.part’ is denied.

So here it doesn’t have to be a permission error, just that one can’t delete a thing that’s actually a folder.
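
If you want to reproduce the same thing against a scratch job, a minimal sketch is below; the backend path and the volume name are made up, so point it at a throwaway test target:

    import os

    # Hypothetical local test target; use a scratch Duplicati job, not a real one.
    backend = r"C:\DuplicatiTest\backend"

    # A name shaped like a Duplicati dblock volume, plus the stray ".part" suffix.
    stray = "duplicati-b00000000000000000000000000000000.dblock.zip.part"

    os.makedirs(os.path.join(backend, stray), exist_ok=True)
    print("created folder:", os.path.join(backend, stray))

    # Now run the test backup: it should complain about 1 remote file not recorded in
    # local storage, and Repair should fail to delete it because it is really a folder.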

Please do some of the suggested tests to narrow down when and how the unwanted folders are arising.

Thanks a lot for your detailed response. We decided to roll back the change and set the remote volume size back to 50 MB. Analyzing this any further is just too time-consuming, and it's important for us that this backup job works, which it does when configured like this.