Computer -> Local server -> Cloud

Basically, I’d like to back up my computers to my local NAS, and then have the NAS back that up to B2, since my local network is much faster than my internet upload speed. So I thought it would make sense to back up each computer locally, and then back up the backups online. It just makes more sense to leave my always-on NAS as always-on rather than have each laptop/desktop running for days or weeks while my upload completes. The backup of the NAS would be for disaster recovery purposes only (flood, earthquake, etc.). Is this doable?

Theoretically you could just have Duplicati back up your NAS’ backup files - after all, they’re just encrypted zip files. Just keep in mind that every time a new update is pushed from one of the endpoint computers, the backup files will change - though I think the storage scheme is such that only some of the files change and thus need to be re-pushed to the B2 archive. Hopefully others will shed some light on this, as I’m still a bit new to Duplicati.

I suggest not using Duplicati for both backups. If you back up an endpoint computer to your NAS, you end up with a bunch of encrypted files on your NAS. If you then use Duplicati again to back up these backup files to another location (cloud), those backup files will be fragmented into blocks, archived, and encrypted a second time.

If you ever want to restore something from your cloud backup, you first have to restore the complete contents of your Duplicati cloud backup folder to your local computer, and then use Duplicati on your local computer just to restore a single file from your backup.

I guess one of these methods would work better:

  • Use a file sync tool to sync files from your endpoint to the NAS and use Duplicati to back up those files to the cloud.
    Pros: You can use the same instance of Duplicati to back up and restore files to/from the cloud, using the configured backup job and Duplicati’s local database.
    Cons: Some features like shadow copies will not work, depending on the capabilities of your file sync tool.
  • Use Duplicati to back up files from your endpoints to your NAS and use a file sync tool on your NAS to upload the backup files to the cloud.
    Pros: All functionality (like shadow copies) is available on your endpoints, and you have “real” snapshots directly available on your endpoints (so restoring a file from an older backup is no problem).
    Cons: In case of a disaster you cannot use your Duplicati installation to restore files directly from the cloud. Also, you have to trust your file sync tool: if anything goes wrong and backup volumes are not uploaded correctly, your backup can become unusable.
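For the second option, the NAS-side sync step could look roughly like this rclone sketch. Note that the remote name `b2`, the bucket name, and all folder paths here are assumptions for illustration, not anything from this thread:

```shell
#!/bin/sh
# Sketch of option 2, run on the NAS (e.g. from a scheduled task).
# Assumes each endpoint's Duplicati job writes to /volume1/backups/<machine>,
# and an rclone remote named "b2" has already been configured for your bucket.

# Sync the Duplicati backup files to B2; only new or changed volumes
# are uploaded, which matches how Duplicati's dblock/dindex files change.
rclone sync /volume1/backups b2:my-backup-bucket/duplicati \
  --transfers 4 \
  --checksum \
  --log-file /volume1/logs/rclone-backup.log
```

Because Duplicati’s backend files are already encrypted, the sync tool itself does not need to add any encryption of its own.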

I also have this question. Thanks, Kees-z, for your reply; your second option would work best for us (using Duplicati to back up locally, then a file sync to our S3 storage). I’m new to Duplicati: is there a guide somewhere on how to set it up for local backup? I only installed it yesterday, and the browser management UI seems to mention only cloud destinations, so I assume I may have to use the command line?

Thanks in advance, Vic

The Getting Started Guide is a good starting point:

See the other articles for more information:

Watch these tutorial videos to see how a basic installation works:


Thanks a lot, sorry for asking the obvious!

I’m using Duplicati to back up to my Synology NAS and from there Synology’s Cloud Sync picks up the Duplicati files and uploads them to my StackStorage 1TB account using WebDAV.
As a test, I used Cloud Sync to download one of the backed-up folders and restored that folder with Duplicati back to my PC. After Duplicati completed the restore process, I performed a binary compare of the original folder and the Duplicati-restored folder with Beyond Compare… and there was no mismatch.
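For anyone without Beyond Compare, a rough way to do the same verification from a shell is to checksum both folder trees and diff the results. This is just a sketch; the demo folders under /tmp are throwaway stand-ins for the original and restored data:

```shell
#!/bin/sh
# Sketch: verify a restore by checksumming two folder trees.
# compare_trees prints "no mismatch" only when every file in both trees
# has identical content and relative path.
compare_trees() {
  ( cd "$1" && find . -type f -print0 | sort -z | xargs -0 sha256sum ) > /tmp/orig.sums
  ( cd "$2" && find . -type f -print0 | sort -z | xargs -0 sha256sum ) > /tmp/restored.sums
  # diff exits 0 (and prints nothing) when every checksum matches.
  diff /tmp/orig.sums /tmp/restored.sums && echo "no mismatch"
}

# Demo with two throwaway folders standing in for original and restored data.
mkdir -p /tmp/orig_demo /tmp/restored_demo
echo "some file content" > /tmp/orig_demo/file.txt
cp /tmp/orig_demo/file.txt /tmp/restored_demo/file.txt
compare_trees /tmp/orig_demo /tmp/restored_demo
```

Any line that shows up in the diff output points at a file that differs between the two trees.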


Good to hear! :smiley:
Can you also restore files when connecting Duplicati directly to Stack using WebDAV? Something like a restore in this video:

Yep, no problem at all to bypass the Synology and restore straight from StackStorage to my PC.


Thank you for the question and the answers, I have been mulling over the exact same thing for a couple of days.

I thought of doing it this way at first, but have now changed my mind to the other approach after considering how a ransomware-afflicted endpoint would behave. I think such an endpoint could also corrupt the Duplicati data on the NAS, and if I don’t catch this before the NAS is synced to the cloud, then all my data and backups could end up corrupted.

By using the “sync endpoints to NAS and use Duplicati on the NAS to do the backup-to-cloud” approach, I think I am less susceptible to a ransomware attack, because the backup data in the cloud would contain both the original and the corrupted versions of my files.

This is on the assumption that endpoints are more likely to suffer from a ransomware attack than the NAS or cloud itself. Is my reasoning sound?

Hi Dave,
As you may have read, I’m using Duplicati on my Windows system to back up to my Synology NAS, and my Synology is completely hidden from my Windows environment, as I don’t have a single share active on that system. If I’m ever hit by ransomware (chances are small, as I’m also running RansomOff) and Duplicati starts backing up my ‘ransomware-attacked files’ to my Synology, I can restore all files minus the latest version, either from my Synology or straight from my cloud storage (in case the ransomware has crippled the files on the Synology).
The reasons I have come to this setup are:

  1. I could not get Duplicati working on my NAS (still have to dig into this issue…)
  2. Once Duplicati has transferred files to my NAS, Synology’s Cloud Sync starts uploading them to my cloud storage almost instantaneously.
  3. In case I have a lot of files/updates to back up, I can turn off my PC once Duplicati has finished the job. As my Synology is running 7x24, it will process the new uploads while my PC is turned off…
  4. Backing up to my NAS server goes much faster (1Gb/s connection) than directly to my Cloud Storage (just 30Mb/s upload).

Hope this helps…


Many thanks, Dick. I don’t have a Synology NAS (yet!), so perhaps I’ve misunderstood how it will be configured: if your Synology NAS is totally hidden from Windows, how does Duplicati write data to your NAS?

By specifying the UNC path \\synology\shared\Duplicati\ in the folder path of the destination. So there is no drive letter linked to my NAS share… As far as I know, when ransomware hits your PC, it goes through all the mapped drive letters to find target files to encrypt.
As mentioned before, I have RansomOff installed and this program also offers folder protection. So, some important folders I have protected with this program. Note that the upcoming Windows Fall Update is going to include folder protection too as part of Windows Defender.

Using a UNC path to a shared folder on your NAS doesn’t safeguard you from ransomware. Admittedly, many ransomware variants use mapped drives to rapidly find files that can be encrypted, but there are more ways an attacker can inventory potential targets to infect. A simple scan of the local network can reveal the existence of the NAS (including manufacturer etc., so protect the config of your NAS with a strong password!) and the file shares available to the malware.
If you type net use at the command prompt, there is a good chance that you will see one or more shares without a mapped drive letter (especially if you previously filled in an authentication window to get access to the share).

For better protection:

  • At the very least, deny access to the share for any credentials you use for anything else on your computer.
  • Better, disable SMB/CIFS/Samba (or whatever your NAS calls it) to prevent a connection to the share via a UNC path. Give access via another protocol that Duplicati supports, like FTP or maybe WebDAV. Allow access for only one user with a strong password that is used only by Duplicati.
  • Newer NAS models have a BTRFS file system. If yours does, check if shadow copies/previous versions are enabled for the shared folder containing your Duplicati backup files. If anything goes wrong, you can simply revert to a snapshot from at most an hour ago (or even multiple years ago). All Netgear and a growing number of Synology NAS devices use BTRFS.
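On a NAS that gives you shell access to a BTRFS volume, the snapshot idea could be sketched roughly like this. The subvolume and snapshot paths are hypothetical, and most Synology/Netgear units would schedule this through their own UI rather than a script:

```shell
#!/bin/sh
# Sketch: hourly read-only BTRFS snapshots of the Duplicati share.
# Assumes /volume1/duplicati is a BTRFS subvolume (path is made up here).
SNAPDIR=/volume1/snapshots
mkdir -p "$SNAPDIR"

# -r makes the snapshot read-only, so ransomware on a client machine
# cannot modify it even if it reaches the share.
btrfs subvolume snapshot -r /volume1/duplicati \
  "$SNAPDIR/duplicati-$(date +%Y%m%d-%H%M)"
```

Reverting then just means copying (or re-exposing) the files from the most recent clean snapshot.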

This makes it hard for ransomware to encrypt the backup files on your NAS. Most infections take place on your local hard disk and on network shares via SMB. If the ransomware finds the FTP folder on your NAS, it has to be aware of the existence of Duplicati and dive into its configuration to find the credentials for your backup target. This is still a potential weakness, but it is very unlikely that today’s ransomware makes use of the Duplicati config files to find credentials for a shared folder on your network.


@kees-z, many thanks for your comprehensive and appreciated description!
As mentioned before, I also have RansomOff installed, which should (hopefully??) protect me against any ransomware attack. RansomOff protects files on local drives, removable devices and network shares, and even protects the Master Boot Record (MBR) from malicious overwrites. So apart from being very careful, I hope that this software will also help me in case of an attack.
All important files on my NAS are also backed up to the cloud by Duplicati, so I should be able to recover my files even if they have been encrypted by ransomware.
I went with RansomOff, but I know there are more (free) software products that can protect you against a ransomware attack.

Whenever I want to copy some files to the Synology, I’m using an FTP connection…

FYI, this is the result of the command you suggested:
C:\>net use
New connections will not be remembered.

There are no entries in the list.

@kees-z thanks for the tip! I’m going to install the WebDAV server on my Synology to avoid using the UNC path in Duplicati… :slight_smile:

I will also be testing this kind of solution while waiting for Duplicati to implement multi-target support…
Having a local copy of the backup is useful for improving restore speed, so backing up to a common location (in separate folders) from several computers and then using rclone to sync new files to the cloud immediately should work fine.
If additional isolation is needed, the common share can be protected with a separate account, or accessed via SFTP/WebDAV…
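A hedged sketch of that rclone step, with a small safety net in case a client corrupts its own backup folder before the sync runs (the remote name and all paths here are made up for illustration):

```shell
#!/bin/sh
# Sketch: push new Duplicati volumes to the cloud as they appear.
# "remote" is an assumed pre-configured rclone remote; paths are hypothetical.
# --backup-dir moves overwritten/deleted files into a dated side folder
# instead of discarding them, so a bad sync does not destroy older volumes.
rclone sync /volume1/backups remote:backups \
  --backup-dir "remote:backups-old/$(date +%Y%m%d)"
```

That way, even if corrupted files get synced, the previous versions still exist in the dated backup-dir folders and can be copied back.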

Wonder if anyone else uses this kind of scenario…