I just installed Duplicati for the first time. I am trying to back up data located on a remote Linux server, using SSH/SFTP. So I add a new backup job, configure the destination (a local path), and now I'm trying to configure the source of the backup. I tried this:
If you're willing to keep a local copy of the files from the remote server, rclone sftp could maintain that.
Duplicati could then back that copy up, adding the value of a compact, multi-version backup. --run-script-before and the other scripting options could be used to build the preparation step into the job's definition.
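A rough sketch of such a pre-backup script, suitable for Duplicati's --run-script-before option. The rclone remote name `webserver` and all paths are assumptions, not anything from your setup; the rclone call is guarded so the sketch degrades gracefully where rclone isn't installed (in a real job you would probably let a failure propagate so Duplicati reports it):

```shell
#!/bin/sh
# Hypothetical pre-backup script for Duplicati's --run-script-before option.
# "webserver" is an assumed rclone remote; the mirror path is a placeholder.
MIRROR="${MIRROR:-/tmp/duplicati-mirror/www}"
mkdir -p "$MIRROR"

# Refresh the local mirror from the remote host before Duplicati snapshots it.
if command -v rclone >/dev/null 2>&1; then
    rclone sync "webserver:/var/www" "$MIRROR" --transfers 4
else
    echo "rclone not available; backing up existing mirror as-is" >&2
fi

echo "mirror path: $MIRROR"
```

The Duplicati job's source would then point at the mirror directory, so refreshing the copy and backing it up happen as one job.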
Whether performance is tolerable may depend on how much data needs to be downloaded.
The file-modification pattern may matter too. Rclone deals in entire files. Duplicati backs up in blocks, but it still needs to read through each updated file to find the changes, and local file scans are far faster than scans over SFTP.
I understand what you're saying, but if I keep a local copy, then that copy already is my backup.
Currently I use a combination of bash scripts with rsync and mysqldump to back up multiple websites locally, and I wanted to see if I could do this without scripts, using a nice web interface instead.
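For context, the script-based approach described here might look roughly like the following. The host, site names, database names, and paths are all hypothetical placeholders, and the remote calls are guarded with short timeouts so the sketch fails fast rather than hanging when the host is unreachable:

```shell
#!/bin/sh
# Sketch of a per-site backup: dump each site's database with mysqldump,
# then mirror its files with rsync. Host, sites, and paths are placeholders.
BACKUP_ROOT="${BACKUP_ROOT:-/tmp/site-backups}"
HOST="user@webserver"
SSH_OPTS="-o BatchMode=yes -o ConnectTimeout=2"

for SITE in example.com blog.example.com; do
    DEST="$BACKUP_ROOT/$SITE"
    mkdir -p "$DEST"

    # Dump the site's database over SSH (mysqldump runs on the remote host).
    ssh $SSH_OPTS "$HOST" "mysqldump --single-transaction ${SITE%%.*}_db" \
        > "$DEST/db.sql" 2>/dev/null || echo "dump skipped for $SITE" >&2

    # Mirror the site's files; --delete keeps the copy an exact sync.
    rsync -a --delete -e "ssh $SSH_OPTS" "$HOST:/var/www/$SITE/" \
        "$DEST/files/" 2>/dev/null || echo "rsync skipped for $SITE" >&2
done

echo "backups under $BACKUP_ROOT"
```

Replacing this wholesale with a web UI is exactly the gap being discussed: Duplicati covers the versioned-backup part, but not the remote-mirroring part.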
If you want just one local copy, then all the extra work Duplicati does goes to waste. It's not a sync tool, but sync tools do exist, and rsync (which handles the file-read and transfer-size problems well) may be all you need.