Backup to Remote QNAP device

I’m curious if anyone has done this or has suggestions on how to set it up. A few years ago, I bought my parents a QNAP TS-431 NAS. They’ve never really gotten around to fully utilizing it, but I want to try and push them a bit. In the meantime, I’d also like to try setting up some clients (starting with Duplicati) to back up to this NAS remotely over the net, i.e. so I can get each of my siblings set up with remote backups.

I have gone and set up myQNAPcloud on the thing, so I can remotely manage the device, but I’m not sure about my options for setting up remotely accessible storage. Ideally this would be done without any port forwarding, but it’s not a big deal if required.

As long as the QNAP can be accessed via something like a drive mapping (not recommended for internet use), FTP, SFTP, etc., you should be able to use it as a destination.

Most likely you’ll need to do port forwarding for whatever technology you decide to use for the destination (usually something like port 21 for FTP, port 22 for SFTP, etc.).
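If you do set up a port forward, it’s easy to check whether the port is actually reachable from outside before pointing Duplicati at it. Here’s a minimal sketch in Python (the hostname and port are placeholders, of course):

```python
import socket

# Placeholder values - use your own public hostname/IP and forwarded port.
HOST = "parents-nas.example.com"
PORT = 22  # 22 for SFTP/SSH, 21 for FTP, etc.

try:
    # Only confirms the TCP port answers; it says nothing about logins working.
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is reachable")
except OSError as exc:
    print(f"Could not reach {HOST}:{PORT}: {exc}")
```

Run it from a machine outside the LAN (a phone hotspot works) so you’re actually testing the forwarded port and not just the local network.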

There seem to be a dozen or so posts about QNAP machines, some of which mention using destination technologies such as those above, so it’s definitely doable.

Perhaps @Kahomono or @marbletravis have some suggestions that could help get you started. If you do get it going, please consider posting how you did it here as a #howto to help other users out! :slight_smile:

I did it old-school: command line and SSH keys. I back up to my QNap over the Internet using SFTP. It is very trouble-free.

I prefer not to use IoT “cloud” services, because I do not tend to give them the benefit of the doubt on handling my data safely or in my best interest. I usually forbid things like QNap from accessing the Internet at all.

In this case, I have very specific firewall rules on both ends restricting the QNap to doing only my backups, its firmware updates, and once a quarter, a Let’s Encrypt certificate update.
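If you want to script a quick sanity check of that kind of key-based SFTP login, something like the sketch below works. It uses Python’s paramiko library; the host, user, key file and share path are placeholders, not my actual setup:

```python
import paramiko

# Placeholders - substitute your own QNAP host, SSH user, key file and share.
HOST = "qnap.example.com"
USER = "backupuser"
KEY_FILE = "/home/me/.ssh/id_rsa_qnap"
SHARE = "/share/Backups"

client = paramiko.SSHClient()
# For a real setup, pin the QNAP's host key instead of auto-accepting it.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, port=22, username=USER, key_filename=KEY_FILE)

sftp = client.open_sftp()
print(sftp.listdir(SHARE))  # should list whatever is already in the share

sftp.close()
client.close()
```

If that lists the share contents, Duplicati’s SFTP backend should be able to write there too.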

It’s been a bit, but I don’t think I ever got it working. I ended up just using S3. That being said, I am game to try again; if I have a moment this weekend, I will give it another whirl.

In theory it should work; QNAP does have an app that creates S3-compatible storage endpoints. I have gotten it to work with another backup solution, just not Duplicati.

Sorry to say that Duplicati doesn’t work correctly with the QNAP S3-compatible storage endpoints (the internal Object Storage Server app); see here

For now I have installed a Minio server in a QNAP container, and with that setup Duplicati works pretty well.
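If it helps anyone, before pointing Duplicati at the Minio container I checked that the endpoint actually answers S3 calls with a small boto3 script. The endpoint, credentials and bucket name below are placeholders:

```python
import boto3
from botocore.exceptions import ClientError

# Placeholders - use the address and keys of your own Minio container.
s3 = boto3.client(
    "s3",
    endpoint_url="http://qnap.local:9000",
    aws_access_key_id="minio-access-key",
    aws_secret_access_key="minio-secret-key",
)

# Create the bucket Duplicati will use; ignore the error if it already exists.
try:
    s3.create_bucket(Bucket="duplicati-backups")
except ClientError:
    pass

# Listing buckets confirms the endpoint speaks S3 and the credentials work.
print([b["Name"] for b in s3.list_buckets()["Buckets"]])
```

Once that works, the same endpoint, keys and bucket go into Duplicati’s S3-compatible destination settings.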

Looks like I’ll have to go old school like @Kahomono suggested. Ideally I would have done something S3-based, as I think I’m going to set up some Minio instances on my unRAID server to provide remote storage to the rest of the family, and having some consistency would be nice. But the QNAP model that my parents have seems to be too underpowered to support all the good apps they’ve put out (containers, etc.).

I’m trying to be a little proactive here and provide a framework in which to allow people to update based on their own experiences - feel free to use it, ignore it, or tell me it’s not useful at all and should be deleted. :slight_smile:

If you like the QNAP How-To idea, please let me know and maybe we can start one up for unRAID (which I use and have set up an unused Minio and a little-used NextCloud on).

So I did play with this a little bit. I currently have the device on my local LAN, so I could test things without having to worry about it leaking on the net.

I turned on the SSH service on the QNAP device, and verified that the SFTP service was also enabled. I then created a new user for this on the QNAP, added the user to the administrators group, and then added the user as an authorized user of SSH. I then created a new share on the QNAP, and verified its exact location in the file system.

Within Duplicati, I created a new backup job using the SFTP (SSH) storage type. I set the server to the local IP of the QNAP device on port 22, set the server path to /share/MyShare/, and entered my username and password. I then clicked the ‘Test Connection’ button and was presented with a prompt asking if I wanted to trust the host key. I clicked Yes, and the test reported success. A new Advanced Option was automatically added to the destination for ssh-fingerprint, with the host key info from the QNAP device. I finished adding the backup job to Duplicati as normal, and then initiated a test run.
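As a side note, if you want to double-check the host key the QNAP is presenting (and compare it against whatever ended up in that ssh-fingerprint option), a few lines of Python with paramiko will print it. The IP is a placeholder, and the format Duplicati stores may differ a bit from this raw MD5 hex, so treat it as a rough cross-check only:

```python
import paramiko

QNAP_HOST = "192.168.1.50"  # placeholder LAN address of the QNAP

# Start an SSH session only far enough to complete the key exchange.
transport = paramiko.Transport((QNAP_HOST, 22))
transport.start_client()

key = transport.get_remote_server_key()
print(key.get_name())               # e.g. ssh-rsa
print(key.get_fingerprint().hex())  # MD5 fingerprint of the host key

transport.close()
```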

What I found was that the performance was absolute crap. The data set was only about 100GB in size, but after an hour or so, it was still working. Duplicati was reporting that it was processing about 7-8MB a second. I looked at the performance monitor on the QNAP, and the CPU was reporting anywhere between 75% and 95% utilization. I decided to cancel the run in Duplicati, but it would seem that this particular QNAP’s CPU just isn’t up to the task of decrypting the SSH traffic. That said, I did see all the dblock and index files on the NAS that had been uploaded to that point.
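To confirm it’s the SSH overhead on the NAS rather than anything Duplicati is doing, one could time a raw SFTP upload of a single big file, something along these lines (host, credentials and paths are placeholders):

```python
import os
import time
import paramiko

HOST = "192.168.1.50"             # placeholder LAN address of the QNAP
USER = "backupuser"               # placeholder account
PASSWORD = "changeme"             # or use key_filename=... instead
LOCAL_FILE = "/tmp/testfile.bin"  # e.g. a 1 GB file created beforehand
REMOTE_FILE = "/share/MyShare/testfile.bin"

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, port=22, username=USER, password=PASSWORD)
sftp = client.open_sftp()

start = time.time()
sftp.put(LOCAL_FILE, REMOTE_FILE)  # single large transfer, no Duplicati involved
elapsed = time.time() - start

size_mb = os.path.getsize(LOCAL_FILE) / (1024 * 1024)
print(f"Uploaded {size_mb:.0f} MB in {elapsed:.1f}s -> {size_mb / elapsed:.1f} MB/s")

sftp.close()
client.close()
```

If a plain transfer like that also crawls along at 7-8 MB/s with the CPU pegged, then it really is the SSH overhead on the NAS and not Duplicati.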

So functionally, this would work. That said, I’ve still got a couple concerns:

  1. I’m not super-familiar with SSH/SFTP. However, my basic understanding is that all SSH connections are encrypted by default, so I don’t need to worry about a third party sniffing the traffic. Can someone confirm?
  2. To set up the remote connection, I’d need to do a port forward to the SSH port. Given that it’ll be running behind a basic SOHO router, it’ll be vulnerable to brute force attacks. @Kahomono seems to get around this with firewall rules, but unfortunately my parents’ basic router can’t do these types of things. As well, it doesn’t appear that the QNAP OS supports restricted SSH access (it only allows SSH for full admins). So an attacker who got in would have full access to the system.

Given the limited performance, and the security implications, I’ll be taking a pass on fully setting this up at this time. Nice to see that it’s “possible” though.

Yes - unless you try REALLY hard (and maybe not even then) SSH (and thus SFTP) connections are encrypted.

As for worrying about third party sniffers, well… that depends on your level of paranoia. :crazy_face:

Pretty much. You’d likely be a bit safer using keys instead of passwords, but if the QNAP really can’t let you lock down an account to specific folders, then yes - that should be taken into consideration.


Performance-wise, I’m not really sure what’s going on. The QNAP should just be a destination and shouldn’t require that much power. I think some people are using Raspberry Pis as destinations!

I’d suggest doing a test job to a local drive and seeing what the performance is like for that - there may be something else causing the problem here.

Backups from the same source to my unRAID server using Local Destination/SMB run at around 50MB/s. As I said, the CPU on the QNAP itself was pretty much pegged while the backup was running/transferring data. The second that I cancelled the backup, the CPU went back down to practically nothing. Simple test, but seems pretty definitive to me that the CPU just can’t keep up with the SSH decryption. If I was really concerned, I’d ask QNAP more about it, but with the other issues, it just doesn’t seem practical.

I guess this is a case of the hardware specs looking good enough but in reality it just doesn’t cut it. :man_shrugging: