Simplest way to do a self-hosted remote backup

I’ve been using Duplicati for years for local personal backups. Now I would like to set up a remote backup to a hard drive at a family member’s house. I need about 2 TB of storage. What is the simplest reliable way to do this? I see that MinIO is popular on here, but it looks like a bit of a hassle to set up, with SSL certificates and so on. Do I really need encryption in transit, since Duplicati already encrypts everything? I figured I could just use FTP, but people say it is error-prone?

Thanks in advance.

The options you have might depend on the hardware you have available. Some simple NAS devices may not support all protocols.

My main concern would not be the backup data itself, as that is encrypted (obviously use a strong random passphrase), but the connection into the local network, which needs to be secured. Since you will need to open a port or set up a VPN connection, this potentially exposes an attack surface. You should do your own research, but as far as I am aware I would prefer SFTP (file transfer over an SSH connection) over FTPS (FTP with TLS encryption). FTP was not designed to be secure, so it does things like sending your password to the file server in plain text. You could get away with that on a local network or over a VPN, but why risk it?

I am not familiar with MinIO, but the setup will probably be more involved than setting up an SSH connection. There have been some issues with supported SSH ciphers, because some were removed from the defaults before the newer ones were widely supported. Ed25519 keys should work reliably for most version combinations.
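If it helps, generating an ed25519 keypair is a one-liner. This is just a sketch; the file path, comment, and user/host names are placeholders, not anything from this thread:

```shell
# Generate a modern ed25519 keypair for the backup user.
# -N "" means no passphrase on the key file (fine for an automated
# backup job; protect the key file itself with file permissions).
ssh-keygen -t ed25519 -f ~/.ssh/backup_ed25519 -N "" -C "duplicati-backup"

# Install the public key on the remote machine so key-based
# login works (placeholder user and host):
ssh-copy-id -i ~/.ssh/backup_ed25519.pub backupuser@remote-host
```

Duplicati can then authenticate with the private key instead of a password.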


Thanks Jojo,

So you are saying that if I used plain FTP, somebody could snoop on my internet traffic, discover the password to the FTP site, and delete my backup or add their own files to the server? That seems unlikely to happen. And/or you are saying that simply opening the necessary ports to access the FTP server would make the computer hosting it more vulnerable to other kinds of attacks?

I was more wondering if there are any technical limitations with FTP to be aware of for backing up more than a TB of data.

I just googled this, and what you said seems about right:

The security of the PC will depend on the security of the FTP server and permissions you give it.

There is also an issue that some FTP servers limit the directory listing size (and just cut off the list, or do other weird things), so you would have to work around that by making your volumes quite large. I would just go with SFTP, because it is also a better protocol design (also mentioned in the link above). It should be trivial to set up on Linux.
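For a locked-down setup, OpenSSH can restrict a user to SFTP only, jailed to the backup directory. A minimal sketch for `sshd_config`; the user name and path are examples, not from this thread:

```
# Restrict the backup account to chrooted SFTP only
Match User backupuser
    ForceCommand internal-sftp
    ChrootDirectory /srv/backup
    AllowTcpForwarding no
    X11Forwarding no
    PasswordAuthentication no
```

Note that OpenSSH requires the chroot directory itself to be owned by root and not writable by the user, so the backups go into a user-writable subdirectory underneath it.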

OK, SFTP was easy enough. I didn’t have to do anything special: I just forwarded the ports on the router, gave the server PC a static IP address, and used DuckDNS with a scheduled task in Windows to get a consistent URL for accessing the server remotely.