If I’m not mistaken, Duplicati can only back up from local to remote, not vice versa. Why is that? I don’t see why such a limitation should be in place. Right now I’m in a situation where I have to back up a NAS that I reach via SFTP to a local drive, and it’s just not possible. I tried an alternative: I mounted the SFTP directory as a local drive with WebDrive, which is said to be the best such program, but it doesn’t work very well. It’s slow and unreliable. Duplicati starts the backup but hangs after a while, and it’s impossible to finish a backup job. I restarted the process 15-20 times during the last week and it never finishes; it just stops and doesn’t do anything for days. I could only back up 500 GB of the 1.5 TB, so I had to give up on Duplicati. It’s just not working for this.
It can handle an SFTP connection just fine when that’s the target. Why can’t it handle a remote as source? That would be rather useful!
I suppose one of the original developers would have to give you the main reason why it was designed this way. But it is what it is.
If Duplicati is running on a Windows machine and your NAS supports SMB, another option may be to use UNC paths or mapped drives. If you have a robust NAS, it may support running Duplicati directly on it. I do this with my Synology NAS - Duplicati runs locally in a Docker container.
Thanks for the tips, but AFAIK SMB only works on a LAN, and this NAS is in a remote location. Unfortunately, it is too weak to run Docker, so no Duplicati on it. It might be possible to hack Docker onto it, but I don’t have the time for that. And I would still need to run a server on my computer to make backwards access from the NAS possible, plus DDNS. So it’s a lot of work for a very inconvenient and not-so-reliable solution. On paper, this WebDrive workaround should have worked. Yet it didn’t.
rclone is my favorite tool for handling copies to/from almost any type of storage. SFTP is supported. Unfortunately it’s just a sync tool, not a backup tool. You’d need something else to handle versioning. Some object storage can do that for you (S3, B2, Azure storage account, etc), but I don’t know if that would fit your needs.
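For reference, pulling from an SFTP source with rclone looks something like this (the remote name `nas:` and the paths are placeholders; the remote would be set up beforehand with `rclone config`):

```shell
# One-way sync FROM the remote NAS TO a local folder.
# 'nas:' is a hypothetical SFTP remote defined via `rclone config`.
rclone sync nas:/volume1/data /mnt/backup/nas-mirror --progress
```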
rclone is also symmetrical - it copies in either direction equally well, which addresses the original question. Backups usually go from local to elsewhere.
Some of this might be due to demand, some to technical difficulty. You would have to scan whole filesystems remotely over the Internet and download complete files to find the changes to back up. Duplicati only backs up changes, but it needs to read whole files to find them. Restore is block-based and requires random access to patch blocks in. SFTP (like many Internet protocols) works with entire files…
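To make the source-side cost concrete, here is a minimal sketch of block-level change detection (hypothetical code, not Duplicati’s actual implementation): even if only one block changed, the whole file has to be read to discover that.

```python
import hashlib

BLOCK_SIZE = 4  # tiny for the demo; Duplicati's default is on the order of 100 KiB


def block_hashes(data: bytes) -> list[str]:
    """Hash every fixed-size block; note the WHOLE file must be read."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]


old = block_hashes(b"aaaabbbbcccc")
new = block_hashes(b"aaaaXXXXcccc")
changed = [i for i, (a, b) in enumerate(zip(old, new)) if a != b]
print(changed)  # → [1]  only the middle block changed, so only it needs uploading
```

Doing those full reads over SFTP across the Internet for every backup run is exactly what makes a remote source expensive.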
The backends encapsulate the actual communication with a remote host behind a simple abstraction, namely that the backend can perform four operations: GET, PUT, LIST, and DELETE. All operations performed by Duplicati rely on only these, enabling almost any storage to be implemented as a destination.
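As a rough illustration of that abstraction (names are hypothetical; Duplicati’s real backends are C# interfaces), a destination boils down to something like:

```python
from abc import ABC, abstractmethod


class Backend(ABC):
    """Hypothetical sketch of a Duplicati-style destination backend:
    the entire remote side is driven through four operations."""

    @abstractmethod
    def put(self, name: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, name: str) -> bytes: ...

    @abstractmethod
    def list(self) -> list[str]: ...

    @abstractmethod
    def delete(self, name: str) -> None: ...


class InMemoryBackend(Backend):
    """Trivial dict-backed implementation, standing in for any real remote."""

    def __init__(self):
        self._store = {}

    def put(self, name, data):
        self._store[name] = data

    def get(self, name):
        return self._store[name]

    def list(self):
        return sorted(self._store)

    def delete(self, name):
        del self._store[name]


backend = InMemoryBackend()
backend.put("duplicati-b1.dblock.zip", b"encrypted block data")
print(backend.list())  # → ['duplicati-b1.dblock.zip']
```

Because the destination only needs these whole-file operations, nearly anything can be a target - but nothing in this contract supports the random reads and metadata scans that a backup *source* requires.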
The remote side is very basic, as above. The local side is sophisticated. It even backs up file properties.
If you’re willing to rclone the data to somewhere Duplicati can see it, then back up from there to elsewhere, that will work. Restores may get awkward because you’d have to reverse the usual flow to put files back onto the NAS.
rclone Feature request - versioning has a discussion on how to do it without special destination support.
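One approach from that discussion is rclone’s `--backup-dir` flag, which gives you crude versioning on any destination: files that would be overwritten or deleted are moved into an archive directory instead. A sketch (remote name and paths are placeholders):

```shell
# Sync to a 'current' mirror; anything replaced or removed is moved
# into a dated archive folder rather than being lost.
rclone sync nas:/volume1/data /mnt/backup/current \
  --backup-dir /mnt/backup/archive/$(date +%Y-%m-%d)
```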
I vaguely recall the forum might also have a script from someone trying to turn rclone into a backup tool.
Sync programs appear a little more flexible than backup programs, maybe because they do complete file copies.
FreeFileSync Detect Moved Files and SFTP suggests it can SFTP to PC, but I sure haven’t tried doing it.
There’s seemingly some support for versioning, although Duplicati’s changes-only store is much smaller.