Create a backup of my backup?

Hi,

I am probably one of the later CrashPlan refugees here; I only have 60 days left there, and now I am looking for a replacement.

One feature I am sorely missing is multiple backup destinations. I know I can define multiple backup jobs individually, but that generates a lot of extra comparison work on the clients, so I was wondering if anyone runs a backup of the Duplicati backup files themselves.

The scenario I have in mind: clients back up to my server, and my server backs up those backups to two cloud providers. (Frankly, I don't trust a single cloud provider to keep my files, so I'd go for two.)

The client backups would run during the day, let's say hourly from 7 AM to 1 AM; the server -> cloud backup would then run daily at 2 AM. I know that if I ever need a recovery from the cloud, I'd have to download the entire backup set, but are there any other disadvantages besides this? Does anyone run a setup like this?

I'm looking for a solution to this as well - something like a mirroring program to mirror my Duplicati backup folder to the cloud.
The other way would be to use rsync on the backup server to mirror it to another server - roughly as sketched below.
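For what it's worth, the rsync route wouldn't need much. A minimal sketch, assuming the Duplicati destination folder is /srv/duplicati-backups and "backupserver" is a made-up host reachable over SSH:

```bash
# Mirror the local Duplicati destination folder to another server over SSH.
# --archive preserves timestamps/permissions; --delete makes it a true mirror,
# so deletions propagate too (see the ransomware caveat later in this thread).
rsync --archive --delete /srv/duplicati-backups/ backupserver:/srv/duplicati-backups/
```

Prefixing that command with a crontab line like `0 2 * * *` would give a nightly 2 AM run.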

Anybody have more ideas?

Is it possible to use Duplicati to make a backup of my Duplicati backups? I can't find how to do this…

Thanks.

FWIW,

I use rclone to sync my backups to the cloud - and between cloud-drive1 and cloud-drive2 just-because.

This ensures on-prem and off-prem backups of the same data.
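In rough terms it's just two sync passes. A sketch, not my exact setup - remote1/remote2 stand for whatever remotes you've set up via `rclone config`, and /srv/duplicati-backups for the local Duplicati destination:

```bash
# On-prem -> cloud 1
rclone sync /srv/duplicati-backups remote1:duplicati-backups

# Cloud 1 -> cloud 2 (data streams through the machine running rclone
# unless the backends happen to support server-side copy)
rclone sync remote1:duplicati-backups remote2:duplicati-backups
```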

Is there a reason you use rclone instead of Duplicati to make these copies? (To avoid a double restore in case of disaster?)

I am also thinking about monitoring here: what if the cloning fails silently? Do you report on this?

Hi @boran_blok,

Purely to use another cog to do the duplication and remove Duplicati from that step altogether. You can of course choose to back up both locally and to the cloud on the client end as well - or have Duplicati use rclone (although I've not done that).

As for syncing my NAS backups to the cloud - rclone sync gives exit codes, so yes, I monitor on that, plus the size/file count of the local copy vs. the remote copy.
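Roughly like this - a sketch with made-up remote/path names, assuming a working `mail` command for the alerting:

```bash
#!/bin/sh
# Fail loudly if rclone itself reports an error (non-zero exit code).
if ! rclone sync /srv/duplicati-backups remote1:duplicati-backups; then
    echo "rclone sync to remote1 failed" | mail -s "Backup sync FAILED" admin@example.com
    exit 1
fi

# Sanity check: compare object count and total bytes of local vs. remote.
# `rclone size --json` prints something like {"count":123,"bytes":456789}.
local_stats=$(rclone size --json /srv/duplicati-backups)
remote_stats=$(rclone size --json remote1:duplicati-backups)
if [ "$local_stats" != "$remote_stats" ]; then
    printf 'local:  %s\nremote: %s\n' "$local_stats" "$remote_stats" \
        | mail -s "Backup sync size mismatch" admin@example.com
    exit 1
fi
```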

Thanks linhead

I’ll test it.

In a previous thread I found there was talk about ransomware risk. What would happen in the scenario where the Duplicati backup files on your local server get corrupted and then synced to the cloud?

Keep versions on the cloud itself as well?

Yep - it’s a good point - nothing beats some cold storage anyway I figure.

I've not played enough with Google Drive (or any other cloud storage, really) - but it would be nice to do something like "mount this gdrive as it looked 2 days ago", since things are versioned anyway (I believe).
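One rclone feature that gets close, if I understand it right, is --backup-dir: instead of destroying files, sync moves anything it would overwrite or delete into a separate folder on the same remote. A sketch with made-up names:

```bash
# Files this sync would delete or overwrite on the remote get moved into a
# dated "attic" folder instead of being lost, so one bad sync run (e.g. after
# ransomware mangles the local copies) stays recoverable.
rclone sync /srv/duplicati-backups remote1:duplicati-backups \
    --backup-dir remote1:duplicati-attic/$(date +%F)
```

And for B2 specifically, I believe rclone can read a bucket as it looked at an earlier point in time (the --b2-version-at flag), which is pretty much the "as it looked 2 days ago" idea.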

Since Duplicati itself is able to do versioning, even if ransomware were to get on the original computer, you could restore pre-ransomware files directly from Duplicati (assuming your retention policy hasn’t pruned them).
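For example, assuming the retention policy still holds a version from before the infection, a restore of that older state might look like this (a sketch - the destination URL and date are placeholders; `duplicati-cli help restore` has the exact options):

```bash
# Restore everything as of a pre-ransomware date into a scratch folder,
# rather than on top of the (possibly still infected) live files.
duplicati-cli restore "b2://my-bucket/duplicati" "*" \
    --time=2018-01-10 \
    --restore-path=/tmp/restore \
    --passphrase="$PASSPHRASE"
```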

The biggest issue for me with automated sync is that if something deletes the "direct" Duplicati backup, the removal of those files gets synced out to your secondary backup, causing you to lose BOTH backups.

If you're using something like rsync and have it run from a --run-script-after command, so that it only fires when the Duplicati backup completed successfully, then this issue is mostly handled: when Duplicati runs and can't find its own destination files, the backup will fail (thus not running the post-backup script).
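To sketch that out: hang the sync off Duplicati's --run-script-after option so it only fires after a successful backup. The DUPLICATI__PARSED_RESULT variable below comes from Duplicati's example scripts - double-check the name against your version, and the paths are placeholders:

```bash
#!/bin/sh
# Configured on the backup job as:
#   --run-script-after=/usr/local/bin/offsite-sync.sh
# Duplicati exports DUPLICATI__PARSED_RESULT (Success/Warning/Error/Fatal).
if [ "$DUPLICATI__PARSED_RESULT" != "Success" ]; then
    # Backup failed or was incomplete - don't propagate the damage off-site.
    exit 0
fi

rsync --archive --delete /srv/duplicati-backups/ backupserver:/srv/duplicati-backups/
```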

I have one backup on my NAS (in RAID 1), one on an external USB drive, and one on Backblaze B2.