What is the recommended architecture for off-site backup?

Hi, I’m new to Duplicati and want to start backing up, mostly my Linux servers (backing up the other Windows machines on my network would be nice to have).
I have a NAS running OpenMediaVault and an Ubuntu 19.04 server running multiple Docker containers. I want to back up a (system + Docker) image of both once a month, and do an incremental backup once a week so I can restore the servers in case of failure. In addition, I also want to keep the picture folder from my NAS backed up in a second location.

I was thinking of backing up both server images locally to a second NAS, and both the images and the pictures folder to a second location.
I’ve got a spare Raspberry Pi 3 I can use with a USB mobile HDD as the second location (running the Duplicati Docker container, for example).
My question is: where should Duplicati be installed in this case, and with what protocols? Should it, for example, be on the local Ubuntu server, using the remote Raspberry Pi as an SFTP file server? Or should the Pi have Duplicati installed, using SSH keys to back up the local servers? I’m lost :cold_sweat:

Thanks in advance!
P.S. Internet upstream is slow here (~5 Mbps max), if that should be a factor.

Note that Duplicati 2 is still under development. The latest beta is old, despite its release date, and there are several serious issues to resolve before the next beta. So make sure Duplicati is not your primary backup system. It can be a secondary system, so you get used to it before a proper release is published.

That’s fine by me, as a 1 TB external HDD will probably let me back up my data more than four times over, and Docker containers are easy enough to create or destroy as needed.
Though, if it’s not too much of a hassle, can you recommend a good alternative based on the requirements I’ve specified?
I went over so much info and so many options that I’m now completely lost. My initial thought was setting up one of the many local cloud solutions on the RPi 3 and using rsync from both my servers, but then I read up on BackupPC and Arq and fell down an infinite rabbit hole.

I would install Duplicati on the machine(s) whose data you wish to protect. In my experience this usually works best.

If you have a NAS on site then configure Duplicati to back up to the NAS. This has the advantage of fast restores since your backup data is local. But you still want the backups stored off-site. Use rclone or similar to sync the backups to an off-site location or cloud storage.
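To make the sync step concrete, here is a sketch of what the rclone leg could look like as a nightly cron job. The paths, the remote name `offsite`, and the schedule are all placeholders for illustration; you would configure the remote with `rclone config` first:

```shell
# crontab fragment: push the local Duplicati backup files to the off-site
# target every night at 02:30. rclone only transfers new/changed files,
# which matters a lot on a ~5 Mbps uplink. --bwlimit caps upload speed
# (KiB/s) so the sync doesn't saturate the connection.
30 2 * * * rclone sync /srv/backups/duplicati offsite:/mnt/usbhdd/duplicati --transfers 2 --bwlimit 512k
```

Since Duplicati stores its data as a set of encrypted, fixed-size block files, a file-level sync like this transfers only what changed since the last backup.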

Good luck!

Thank you for the suggestion!
What would be a good solution for the RPi in that case? A local cloud like ownCloud, or just a simple SFTP server?

RPI = Raspberry Pi? It may not be powerful enough to run Duplicati directly; I’ve never tried it myself. Try searching the forum for other users’ experiences. If a device is too underpowered, it may be better to back it up from a remote Duplicati install.
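For what it’s worth, using the Pi purely as an SFTP target needs nothing beyond OpenSSH, which Raspbian already ships. A minimal setup sketch, assuming the USB HDD is mounted at /mnt/usbhdd (paths and the account name are placeholders):

```shell
# On the Pi: make sure the SSH service is running (its sshd includes the
# sftp subsystem, so no separate SFTP server is needed)
sudo systemctl enable --now ssh

# Dedicated account for backups, so Duplicati doesn't log in as "pi"
sudo adduser --disabled-password --gecos "" duplicati

# Storage directory on the USB drive, owned by that account
sudo mkdir -p /mnt/usbhdd/duplicati
sudo chown duplicati:duplicati /mnt/usbhdd/duplicati
```

A plain SFTP target like this is much lighter on the Pi than running ownCloud or a Duplicati instance there.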

Yep, the Raspberry Pi. I think I’ll go with your suggestion: Duplicati on the two servers as you’ve said, backing up to the local NAS and duplicating remotely to the RPi. The thing is, I don’t know what would be the fastest and lightest protocol for the remote machine (Pi or otherwise) to sync through; maybe SFTP?

Good question… You might want to experiment; I’m not sure which protocol would be fastest when targeting a Raspberry Pi.
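If you do go with SFTP, the Duplicati side is just a destination URL. A sketch of what that might look like (host, path, and username are placeholders, and I’m quoting the option names from memory, so double-check them against the destination screen in the UI):

```
ssh://pi.local:22//mnt/usbhdd/duplicati?auth-username=duplicati&ssh-keyfile=%2Fhome%2Fme%2F.ssh%2Fid_rsa
```

Note the double slash for an absolute path on the Pi, and that parameter values in the URL need to be URL-encoded; the GUI handles all of this for you if you use the destination picker.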

I have run Duplicati on a Raspberry Pi 3, and it does work. However, installing and upgrading the Mono packages takes a very long time. Raspbian is built on Debian, and those OSes cannot handle OneDrive for Business, as another thread discusses (that’s why I tested on the Pi).
So if you’re fine with Duplicati running a bit slowly, slow upgrades, and you’re not using OD4B, it’ll work.

Thanks for the input, but I think having the Pi as simple accessible remote storage, with the full-fledged servers doing the heavy lifting, seems best. I haven’t finished this second-location backup setup yet, so I’m not sure how well it works.