Old backup files cannot be removed from a secured Synology Samba share (bug?)

Hello!

I have created an account for backup purposes on my Synology NAS. I have put it into all backup jobs and it works fine for backing up. Nonetheless, it appears that this account isn’t used by the application to clean up obsolete files.

MacOS, Duplicati 2.0.2.1_beta_2017-08-01

Thanks,
Haron

When you say “clean up obsolete files” do you mean removing old versions of files from your backups or do you mean removing actual backup (dblock) archive files?

How are you determining that the account isn’t being used?

When you say “clean up obsolete files” do you mean removing old versions of files from your backups or do you mean removing actual backup (dblock) archive files?

The second: “Access to the path “/Volumes/auto/XXXX/xxx-bd673be617fcc4406b0590d10f52ec427.dblock.zip” is denied.”

How are you determining that the account isn’t being used?

It’s just my guess: the backup procedure still ADDs files without problems, but it can no longer remove them since I specified the backup account in the task settings.
I’ve also tested it manually, and I confirm that this account can delete files from the share.

Do you mean you added a --backup-name / --prefix parameter in the Advanced options?

If so, then the problem may be that it’s using the backup-name prefixed naming convention to try and delete a dblock.zip file with the old/default prefix of “duplicati-”.
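
If you want to double-check that theory, listing the prefixes actually present on the destination would show it at a glance. A rough sketch, with a placeholder destination path:

```bash
# Count remote volume files per name prefix; "/Volumes/backup/job-folder"
# is a placeholder for the actual destination folder.
ls /Volumes/backup/job-folder/ | sed 's/-.*//' | sort | uniq -c
```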

Nope, it’s not about those parameters; it’s the Username and Password on the “Backup destination” GUI page.

OK, so much for that theory. :thinking:

Are you using FTP, WebDAV, UNC, or something else to connect to your Synology NAS? I ask only because there has been at least one report of other issues related to UNC usage (Fatal error during backup to Synology via UNC share - #9 by Dick_Hoogendoorn).

I’m using SMB shares. They are mounted on my Mac as volumes, so I’ve specified the volume path in the Duplicati settings (nothing like “smb://…”). I have never experienced any problems with data exchange. The trouble only started now (with the obsolete dblock files), after I decided to use a special account for backup purposes; my own account no longer has read-write access to the shares, for extra data protection.

Ahh, so you originally ran it under your own account then switched it to a custom backup account?

Is it possible the custom backup account doesn’t have permissions to delete the original files created with (and I assume owned by) the first account?

If that’s the case, then IN THEORY you should only have to have the backup account “take ownership” of the files…

Ahh, so you originally ran it under your own account then switched it to a custom backup account?

Correct!

Is it possible the custom backup account doesn’t have permissions to delete the original files created with (and I assume owned by) the first account?

I’ve just tried that: I mounted the share using the backup account and moved an old block file off it without any problem. I think Unix permissions don’t apply over SMB.
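
For reference, the manual check amounted to something like the following (mount point, folder and file name are placeholders for this sketch):

```bash
# With the share mounted via the backup account, try to move an old
# dblock file off the share; if this succeeds, the account can delete
# files there. Path and file name are placeholders.
mv "/Volumes/backup/job-folder/duplicati-bEXAMPLE.dblock.zip" /tmp/
echo "exit code: $?"   # 0 means the move (and hence the delete) worked
```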

If that’s the case, then IN THEORY you should only have to have the backup account “take ownership” of the files…

I can’t even see such an option in Synology DSM.
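
There’s no obvious “take ownership” button in DSM’s shared-folder settings as far as I can tell; if anything, it could probably be done from an SSH session on the NAS, roughly like this (the volume path and account name are assumptions for this sketch):

```bash
# Run on the Synology over SSH as an administrator account.
# "/volume1/backup" and "backupuser" are placeholders.
sudo chown -R backupuser:users /volume1/backup
sudo chmod -R u+rwX /volume1/backup
```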

What can we do to debug that?

I’d start by making a test backup in the same way as the one that’s failing (but with a smaller source, to a unique destination folder, and without scheduled backups). Note that if a user account change happened with the failing one, it should be replicated with the new one.

Run the job, change the user account, then try deleting the job (including remote files).

You were right. This problem affects every Duplicati operation with Synology Samba shares, which means I have no backups for the past days at all!
What I did to check it:

  1. New account with a simple alphanumeric password and read-write access to the backup share.
  2. New simple task to back up a small folder onto the share as a mounted drive (/Volumes/backup).
  3. Run the task
    It was unable to upload!
  4. Unmount the share (which was mounted with my own account), mount the share with the backup account (see the sketch after this list).
  5. Run the task
    Success!
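
Step 4 was done roughly like this (server and share names are placeholders):

```bash
# Unmount the copy that macOS mounted with my own account:
umount /Volumes/backup

# Remount the same share with the backup account instead. "nas" and
# "backup" are placeholder server/share names; Finder will prompt for the
# backup account's password and mount the share back at /Volumes/backup:
open 'smb://backupuser@nas/backup'

# Alternatively, mount_smbfs can be used with a mount point of one's own:
#   mkdir -p ~/mnt/backup && mount_smbfs //backupuser@nas/backup ~/mnt/backup
```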

Which means Duplicati cannot use the account properly. Can we fix it ASAP?

First we need to verify other users have the problem and that it’s not specific to your setup or a particular Synology DSM version. Do you know what version of DSM you are running?

I’m going to update the topic title to indicate it’s a Synology issue in the hopes that somebody who has access to one can replicate the failure.

I should note that I believe some other users who have had Synology Samba / UNC share issues ended up shifting to FTP before we could identify the source of their problems.

Hi again,
I was thinking about the issue for a couple of days. It appears that the issue is related to how Duplicati works with network drives on MacOS. It’s not about Synology, as any network share would show the same problem. Look:

  1. When I log in, MacOS automatically mounts the NAS share for backups (/Volumes/backup). Remember: for my own account it’s now read-only.
  2. When Duplicati starts the backup task, it tries to use the same share (/Volumes/backup). The fact that I’ve put different credentials in the task settings is irrelevant, as the share is already mounted!

As far as I understand, it should be possible to specify the share’s network name, not the local name, i.e. something like “smb://nas/backup” instead of “/Volumes/backup”. Only in that case could Duplicati mount the share using the correct credentials.

Another idea is to always mount the share using the backup credentials, which would defeat the purpose, as the whole reason for this special backup account was to reduce the chance of backup modification to zero (whether by human mistake or an encrypting virus).

What do you think?

I think I’ve now found the cause of the problem. Please read my updated comment above.

One more idea: is it somehow possible to specify pre-backup and post-backup actions? In that case it wouldn’t be a big problem to mount the share using the proper account and unmount it after the backup operation.

You can use the --run-script-before, --run-script-before-required and --run-script-after options.
--run-script-before-required will abort the backup job if the script ends with an error level other than 0.
Use --run-script-timeout to specify how long Duplicati will wait for the script to complete. The default value is 60 seconds.
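
A minimal sketch of how that could look for this setup, assuming a dedicated mount script and the mount_smbfs approach; the script path, server, share, account name and mount point are all placeholders, not a tested recipe:

```bash
#!/bin/sh
# Placeholder script, e.g. /Users/haron/mount-backup.sh, intended as the
# value of --run-script-before-required. All names below are assumptions.
MOUNTPOINT="$HOME/mnt/backup"
mkdir -p "$MOUNTPOINT"

# Mount the share with the dedicated backup account. Putting the password
# in the URL is the simplest non-interactive option, though not ideal
# security-wise. A non-zero exit aborts the backup job.
mount_smbfs "//backupuser:PASSWORD@nas/backup" "$MOUNTPOINT" || exit 1
exit 0
```

The counterpart for --run-script-after would simply run `umount "$HOME/mnt/backup"`, and the backup destination in the job would then point at that mount point rather than the Finder-mounted /Volumes/backup.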

Still thinking about this issue. Look:
There are a number of tasks (about 10) scheduled at, say, 0:01, 0:02, 0:03 and so on, just to be sure that they run in the proper order. All of them actually take many minutes (the last one up to 2 hours). As far as I understand, there is a scheduler / queue in Duplicati, so they won’t run simultaneously, which is good.
I suppose that, instead of running mount and unmount before and after each task, it would be better to mount the share as the first (virtual?) task and unmount it as the last (virtual?) task. Hence, I’m going to create two tasks:

  1. At 00:00 (before any real task): back up nothing, just run a script to mount the share.
  2. At 00:10 (after any real task): back up nothing, just run a script to unmount it.

Is it a good plan?

I expect that would work as you describe, though you might want to check whether the share is already mounted before trying to mount it - just in case something happens that causes “yesterday’s” step 2 to not fire, thus leaving the share mounted.
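
For instance, something like this at the top of the mount script would cover that case (the mount point is a placeholder, matching the sketch above):

```bash
# If the share is still mounted from a previous run, don't mount it again:
if mount | grep -q "$HOME/mnt/backup"; then
    echo "backup share already mounted, skipping"
    exit 0
fi
```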

Strange, it works fine when I run it from Terminal, but Duplicati doesn’t run it at all:
run-script-after = "/Users/haron/remote.sh off"
What am I doing wrong?