I want to back up my Google Drive to another cloud service for redundancy, but I'd prefer not to keep all of my Drive's files on my computer. That means I would need to sync one cloud to another cloud. I found a way to do this with Duplicati, but I'm unsure whether it is risky or sustainable.
Having looked into it, I see Duplicati does not support this out of the box, but I seem to have found a way to make it work: I used rclone to mount cloud1, then set up Duplicati to back up the mounted cloud1 to cloud2, a WebDAV destination.
I used the following to mount cloud1 (Kubuntu 18.04):
sudo rclone mount cloud1: /home/teun/Cloud/Remote/cloud1 --vfs-cache-mode writes --allow-other
Since this is just a test, I used sudo because I did not want to edit /etc/fuse.conf.
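For what it's worth, the sudo workaround maps to a one-line config change: --allow-other only needs root because FUSE restricts that flag to root by default. A sketch of the non-sudo setup, reusing the mount path from the command above (the --daemon flag, which backgrounds the mount, is an addition not in the original command):

```shell
# One-time, as root: allow non-root users to pass --allow-other to FUSE.
# In /etc/fuse.conf, uncomment (or add) this line:
#   user_allow_other

# After that, the mount works without sudo; --daemon backgrounds it.
rclone mount cloud1: /home/teun/Cloud/Remote/cloud1 \
    --vfs-cache-mode writes --allow-other --daemon
```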
Duplicati is backing up and uploading at my regular upload speed and has not given me any errors so far. The plan is eventually to set up Duplicati on my Scaleway VPS to do all of this for me. Would you recommend this solution, or advise me to do something else?
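On the VPS, the fragile part of this scheme is keeping the rclone mount alive across reboots and network hiccups. One common approach is a systemd unit for the mount; a sketch only, where the unit name, mount path, and binary location are assumptions to adapt, not taken from the post:

```
# /etc/systemd/system/rclone-cloud1.service  (hypothetical unit)
[Unit]
Description=Mount cloud1 via rclone
After=network-online.target
Wants=network-online.target

[Service]
# rclone mount can signal systemd when the mount is ready (Type=notify)
Type=notify
ExecStart=/usr/bin/rclone mount cloud1: /mnt/cloud1 \
    --vfs-cache-mode writes --allow-other
ExecStop=/bin/fusermount -uz /mnt/cloud1
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

With `Restart=on-failure`, a dropped mount comes back on its own instead of silently leaving Duplicati looking at an empty directory.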
If I am posting in the wrong category, my apologies. It did not really fit in anywhere.
There is also an rclone backend integrated into Duplicati 2.
I wonder if it is possible to use it as an rclone source, to back up cloud-to-cloud.
Duplicati is designed around the source files being local to where Duplicati is running. While mounting a remote file location (LAN share, rsync, cloud, whatever) to make it “local” to Duplicati should work just fine, you should be aware of the following:
1. Performance will be VERY dependent on your network speed, as EVERY new or updated file (which is everything on the initial backup) will have to be fully downloaded for Duplicati to process before it can then be uploaded to the destination.
2. Because of #1, you may run into bandwidth caps depending on your cloud provider.
3. If the mounted cloud source is not available when Duplicati runs, it will either error out or (if the mount is part of other actual local backups) consider the cloud files to have been deleted. You shouldn’t lose any backups as long as the mount comes back online for a backup before your retention rules kick in; however, when browsing for restores you may find some versions where your cloud files aren’t listed (since Duplicati thought they were deleted at the time of that backup).
4. Remember that, thanks to deduplication, when the mounted cloud source DOES come back online, Duplicati likely won’t have to re-upload everything, since all the blocks still exist.
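The missing-mount caveat can be guarded against with a pre-flight check that refuses to start the backup when the mount is gone. A minimal sketch, using the mount path from the original post (the actual backup trigger is left as a placeholder):

```shell
#!/bin/sh
# Refuse to back up when the rclone mount is not attached, so an offline
# cloud1 never looks to Duplicati like a mass deletion.

check_mount() {
    # mountpoint(1) returns 0 only if the directory is an active mount point.
    mountpoint -q "$1"
}

MOUNT_DIR="/home/teun/Cloud/Remote/cloud1"   # adjust to your setup

if check_mount "$MOUNT_DIR"; then
    echo "cloud1 mounted; OK to run the backup"
    # ... trigger Duplicati here ...
else
    echo "cloud1 mount missing; skipping backup" >&2
fi
```

Scheduling this script (cron, systemd timer) instead of Duplicati's own scheduler means a dead mount skips a run rather than recording a backup version with everything "deleted".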
At present the only way to use Duplicati to back up a remote destination is as described above - mount the remote folder as if it were local.
While in theory Duplicati could be updated to use rsync as a direct source, you’d still have all the same performance and bandwidth issues.
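Worth noting: if a plain mirror (no versioning, no deduplication) is enough for redundancy, rclone can already copy cloud-to-cloud on its own, without Duplicati in the loop. A sketch, using the cloud1:/cloud2: remote names from earlier in the thread; the destination folder name is made up for the example:

```shell
# One-way mirror of cloud1 into a folder on cloud2. Unless the two
# providers support server-side copies between each other, the data
# still flows down and back up through the machine running rclone.
rclone sync cloud1: cloud2:cloud1-mirror --dry-run
# Drop --dry-run once the planned transfers look right.
```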