--alternate-destination-marker not supported?

grossepointblank (fun movie!), I’m a little confused - are you trying to back up “\\DISKSTATION\music” to S3? If so, then I think the --alternate-destination-marker and --alternate-target-paths parameters are not what you need.

Those two parameters are for helping Windows to identify local drives as DESTINATIONS for backups, not sources. They exist mainly because Windows will assign a USB drive a different drive letter if the last one used is already in use.

What happens if you remove those two parameters?

Ah. That helps. Yes, I’m trying to back up from my M:\ drive (also known as \\diskstation\music) to S3. When I run from the Duplicati web interface, or if I run the batch file interactively and without the --alternate* parameters, it works fine.

I was mistakenly trying to use the flags to find the source rather than the target, as there’s a limitation in Windows Task Scheduler that prevents it from seeing mapped network drives when running jobs (although it can see the UNC path just fine).

I originally created the backup configuration pointing at M:\ as the source - does that mean I just change the “M:\” source parameter on the command line to “\\diskstation\music\” and it will correctly resolve the source? I was wary of getting this wrong and corrupting the backup.

thanks (and for getting the movie reference too)

I’ve run into these types of Windows scheduled task permission issues before and there are usually a few ways around them.

My guess is the issue here is that you’ve got M: mapped under your user login but the scheduled task is running under a different account in which M: has NOT been mapped.

There are likely a number of different ways to handle this including:

  1. As you suggested, change your backup to use the UNC reference instead of the drive letter. (Note that you may still have file permission issues once you get there.)
  2. Run the scheduled task as a different user (such as your own) that DOES have the necessary permissions. A drawback of this is that if you change your user password (you do have one that you change periodically, right? :-)) then you have to remember to go into the scheduled task and change it there too.
  3. Depending on your version of Windows you might be able to check the “Run with highest privileges” box.
  4. Map the drive inside the batch file using something such as net use (see net use /? or the internet for more on that) or pushd and popd - see the sketch after this list.
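
A rough sketch of option 4 (the share name, install path, and job details are placeholders - adjust for your setup):

    @echo off
    rem Map the share for this logon session ("net use" does the mapping).
    net use M: \\diskstation\music /persistent:no

    rem Then run the usual Duplicati command against the mapped drive, e.g.:
    rem "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup <your-S3-URL> "M:\" <your usual options>

    rem Drop the mapping again when the backup is done.
    net use M: /delete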

If you’re worried about screwing up your existing backup, I’d suggest making a second backup set as a test and choosing a small subfolder of \\diskstation\music as the source to play with while you’re testing.

But I doubt you’ll have problems because Duplicati backs up chopped-up blocks of each file, not the raw file itself - so even if you change the path or filename Duplicati will still see the same blocks of file contents.

Thanks for liking and “solutioning” my post! I’m curious which method ended up working for you.

Thanks. The UNC pathname works - it turns out that mapped drive letters are a per-session concept in Windows, not available to background tasks, so even if I tried “run only when user is logged on” or “run with highest privileges” it would not see the drive letter. Found this out with a bit of further digging.

I’ve replaced “M:\” in the batch file with “\\diskstation\music\” and added --dry-run for good measure. It’s running as I type.
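
For reference, the core of the batch file is now roughly this (a sketch - the install path and S3 URL are placeholders, and the credentials and other job options are left out):

    "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "s3://my-bucket/music" "\\diskstation\music\" --dry-run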

One final question, then: I understand the concept, but the log is showing lots of lines like this:

[Dryrun]: Would add new file \\diskstation\music\Turin Brakes\Ether Song\09-Full of Stars.wma, size 4.31 MB

Is that a false notification or is something else making the hash algorithm think this is a different file?

These two options are for the destination, not the source, I believe.
Your destination is S3, where they do not apply.

Why don’t you use a UNC path as the source?

Because of the path change Duplicati sees this as a new file, but once it looks up the content blocks of the file it will say “hey, I’ve already got these contents stored under a different path so I’ll just say it’s now also at this path”.

Depending on the rest of your settings, like retention duration, what will most likely happen is that the M: drive files will be seen as “deleted” and in 180 days be flagged for deletion.

However, because the content blocks are still in use by the UNC versions of the same files, instead of deleting the files Duplicati will just delete the M: PATHS associated with the files.

So assuming you’re using the standard 180-day retention period and you made no changes to your files for 180 days straight, Duplicati would basically see no changes for 180 days, then on day 181 just remove all the M: drive paths but leave all the UNC paths and data unchanged.

That’s a really nice solution - it would store the multiple paths in the index file, but not duplicate the dblock files. So even if I end up using both notations, as long as the backups run frequently enough there won’t be any additional overhead.

Thanks again for the guidance - I’ll run some more tests on a small sample set to make sure but pretty certain you’ve given me the answer.

Final update. I ran the test as described, creating an original backup using M:\… as the source location, then ran the same backup from a batch file using UNC \\diskstation\music\… as the source.

Result: Works as described. Although the --dry-run output lists all the files as new, once the backup executes it does not upload any new dblock files, and only updates the dlist entry.

The batch file was also tested under Windows Task Scheduler. It wakes the machine from sleep and seamlessly runs the backup job.

Yeah, the “newness” of a file is determined by its path, as technically it could be a whole new file - one that just happens to have the same contents as an already backed-up file. :-)

Glad to hear it worked out for you! If you get a chance please check the “This reply solves the problem” box on the “solving” post so other users doing searches know this is a good topic to read.

Hello,
please, where are --alternate-destination-marker and --alternate-target-paths?
Thank you.

Hello @MQ1 and welcome to the forum!

The easiest place to find them is the Destination screen for “Local folder or drive” under “Advanced options”.

Thank you.
Duplicati backs up to the destination you can see in the attachment, i.e. an external HD connected to a NAS, not on my PC.
To find --alternate-destination-marker and --alternate-target-paths I had to change the destination storage type from FTP to local folder. After that I set FTP again and the connection works.
Now my problem is: I use two different HDs (named 10211 and 10212) on alternate weeks (on Monday of the first week I connect “HD10211” to the NAS until Saturday of the same week, then I disconnect it; on Monday of the second week, I connect “HD10212” to the NAS until Saturday of that second week; on the third Monday, I connect “HD10211” again).
Can you tell me if the settings you see in the hyperlink are correct?
Thank you very much.

They look unlikely to be correct. I suspect the naming might be attractive, but please look at the documentation.

Local folder or drive describes their format and use for that case, but forcing them to FTP usage is unusual.

Windows Drive Letters describes the cases they intend to solve. FTP fits neither. I don’t know what the result will be.
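
For a plain local-drive destination, their use looks roughly like this (an illustration with made-up values, not something for your FTP setup):

    --alternate-destination-marker=duplicati-drive.txt
    --alternate-target-paths=*:\Duplicati-backup

The marker file is one you create yourself on the backup drive, and, as I read the documentation, the * lets Duplicati find that drive whatever letter Windows assigns it.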

What I can say definitely is that if the intent is to have one job on two drives, it won’t work, as there is heavy reliance on knowing the destination content, and switching drives breaks that. A job for each drive should work. Exporting and importing can move the configuration, and then you can adjust things like the folder for the second drive.

Using a different folder on each drive is probably good because you’ll get an error instead of a wrong write; however, please test a wrong-drive use on an unimportant backup. If it’s too ugly there may be another way.

Backing up to multiple USB or external drives? has further explanation about what you might be trying to do.

Hello,
I solved the problem by renaming the second HD from the NAS. Now both HDs have the same name.
The connection to this second HD is OK (it works):


Unfortunately, when I launch the backup, the message below appears.
Can you help me?
Thank you very much.

This wasn’t solving things, it was making them worse. The differing names that you undid were actually a benefit, because they made sure you got this sort of error if you connected the wrong drive for your backup. In this case, you might have done a backup to one drive, switched drives, then gotten this error saying that this different drive has 6 files missing, because they’re on the other drive. To restate: two drives (is one offsite?) require two jobs.

The local Duplicati database keeps careful track of what files it thinks should be on the remote drive because it wrote them there (and didn’t delete them). This helps it have confidence that all is well, and it can rely on other information it caches about the destination content. If Duplicati had to look all the time, performance would dive.

EDIT: I’m guessing at exactly how far things have gone, but if it’s one good backup and then this error after the switch to the second drive, the second drive might still be empty. If so, try the first drive and see if the backup still works. If it does, then set things up as suggested with a second job pointing to the second drive, using a different folder as you had before. If you have actual backup data on both drives already, unconfusing the situation might not be practical, and starting over with clean backups and destinations may be easiest.

I understand, thank you.
So, I have to set up a second job pointing to the second drive.
Since I use two different HDs (named 10211 and 10212) on alternate weeks (on Monday of the first week I connect “HD10211” to the NAS until Saturday of the same week, then I disconnect it; on Monday of the second week, I connect “HD10212” to the NAS until Saturday of that second week; on the third Monday, I connect “HD10211” again), I solved the problem by starting Duplicati the first week from the default account and the second week from another account (Guest). So each job points to only one drive.
Correct?
Thank you very much.

This text was the only part that surprised me, but I’m not sure what your goal is. If you’re running the backup manually, just have two jobs and run the right one for the week. If you guess wrong, you get the earlier error.

The automatic job scheduler is probably not configurable enough to run frequent backups and then skip a week. Running backups every two weeks on each job (with the initial backup on a different week per job) might succeed; however, that’s not a lot of backups, if that matters. You could also go asymmetric: have one job run most of the time, and occasionally run the other job so you can take that drive offsite in case a disaster happens.

Another concern about two accounts is that ordinary user accounts can typically only access their own data.

“Another concern about two accounts is that ordinary user accounts can typically only access their own data.”
No problem: the source is on a NAS.
I think that I can schedule the two jobs using the Windows task manager on alternate weeks.
So the first week Duplicati (with the first job, i.e. Alfa) will run from the default account, and the second week (with the alternative job, i.e. Beta) from Guest. Same source and same destination for each job (the external drives have the same name).
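For example (a sketch only - the task names, script paths, and time are made up; the two tasks would also need /SD start dates a week apart, and /RU needs the account’s password via /RP):
    schtasks /Create /TN "Duplicati Alfa" /TR "C:\Scripts\backup-alfa.bat" /SC WEEKLY /MO 2 /D MON /ST 02:00
    schtasks /Create /TN "Duplicati Beta" /TR "C:\Scripts\backup-beta.bat" /SC WEEKLY /MO 2 /D MON /ST 02:00 /RU Guest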
What’s your excellent opinion?
Thank you.

If you meant Windows Task Scheduler, some people do use it for Duplicati, but I’m not familiar with its abilities.

https://forum.duplicati.com/search?q=%22windows%20task%20scheduler%22

I think you might be asking for trouble going back to identical drives, but there might be a different check that would avoid disaster. Duplicati by default has a good look at the destination file list to see if it’s as expected…

Please test it to see what error it gives, but don’t try to use --no-backend-verification to turn off the safety net, which is probably what’s being referred to in the How to test which database belongs to which job? topic.