Backing up Duplicati backups

Hi,
I want to back up my file server, which has loads of Duplicati backups on it (I host several people's backups).

My plan is to use lftp over SSH to copy the Duplicati backup folders to a remote server, just in case mine gets pinched or goes faulty.
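
Something like this is what I had in mind (paths and hostname are made up for illustration, and I'd use SSH key auth rather than a password):

```bash
# Reverse mirror: upload the local backup folders to the remote host,
# transferring only files that are new or changed since last time.
lftp -e "mirror --reverse --only-newer --verbose \
    /srv/duplicati-backups /home/me/backup-mirror; quit" \
    sftp://me@remote.example.com
```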

Is this ok to do, or will I wind up with corrupt data?

Duplicati backups can be moved around freely, but make sure your method of moving doesn't corrupt files. For example, plain FTP can be unreliable: it can truncate files without noticing, and it has an ASCII transfer mode that corrupts binary files like the ones Duplicati produces (it also has a binary mode, which is the one to use). SFTP over SSH, as you're planning, avoids those particular problems.

Another thing to consider is what to do with the local database, which Duplicati uses to track what's on the destination. It can generally be recreated from the destination files, but that can be very slow. If you plan to do only occasional remote copies, you might consider copying the matching database as well, so it's all in one tidy just-in-case package.
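
As a minimal sketch, assuming a Linux machine where Duplicati keeps its job databases under ~/.config/Duplicati (the exact .sqlite path for a given job is shown in that job's database settings; the filename and remote folder here are made up):

```bash
# Run on the machine that owns the backup job, after the job finishes.
# Copies the job's local database next to the mirrored destination files
# so the whole just-in-case package lives in one place.
# Assumes the db/ folder already exists on the remote.
scp ~/.config/Duplicati/EXAMPLEJOB.sqlite \
    me@remote.example.com:/home/me/backup-mirror/db/
```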

If you do this a lot, even keeping the destination files in sync may be a little tough, unless you want to copy everything every time. Although lftp does look like it can skip files that are already on the remote, it's not clear how carefully it checks that an existing file is complete rather than a partially written leftover from an interrupted earlier run.

There are other programs that compare files more carefully. Rclone is an open-source option that can compare checksums rather than just names, sizes, and timestamps.
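
For example, here's a sketch using a hypothetical rclone remote named `offsite` (set up beforehand with `rclone config`, e.g. as an SFTP remote pointing at the second server). The `--checksum` flag makes rclone compare file checksums instead of just size and modification time, and `rclone check` verifies source and destination after the fact:

```bash
# Sync local backup folders to the remote, comparing checksums
# (needs a backend that supports hashing; SFTP does if the server
# has md5sum/sha1sum available).
rclone sync /srv/duplicati-backups offsite:backup-mirror --checksum

# Verify that source and destination actually match, catching
# truncated leftovers from an interrupted earlier run.
rclone check /srv/duplicati-backups offsite:backup-mirror
```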

Being too automated runs the risk that some local disaster, such as ransomware, gets faithfully copied over the remote files before you notice. Having Duplicati back up directly to the remote as a second job is safer, and it also gives you two independent backups in case one breaks. Still, any remote copy is probably better than no remote copy at all. It all depends on how much redundancy you want.