Best practice for Fastest transfer?

Windows 10 PC, Ubuntu 16.04 file server

One of the threads I’ve read here stated that it’s best to push a saveset to a remote location, as pulling it incurs more processing overhead. True? I’m currently running Duplicati from my desk PC to push and pull from the server (each hosts backups for the other until I can buy a dedicated backup machine).

On another note, I used to use GoodSync to copy files back and forth (no compression/deduping, though), and there was a profound improvement in transfer speed using SFTP instead of accessing the SMB shares on the server. Would I find the same for Duplicati’s process? If so, I’d need to add a copy of Duplicati to the server so as to be able to push via SFTP, since you can’t pull that way. That also means setting up SSH on the Windows machine just to do LAN backups (no outside connections). I just want to make sure I’m on the correct path before committing one way or the other.

Hi @mtsgsd, I’m not quite sure what you mean by this. The processing overhead always happens on the machine that is running Duplicati. In your case I suggest you leave it on the W10 PC.

If you are using Ubuntu and Samba to create shares on the Linux box, then it should be as fast as FTP or SSH. Samba supports SMB 3.x, so it’s pretty responsive. At the end of the day it all comes down to your LAN configuration. If you’re backing up over WLAN then it can potentially be slow; if it’s a wired network like 100 Mbps or 1 Gbps Ethernet then it should be fast. The first (full) backup will always take quite a long time, depending on the amount of data you want to back up, but after the full is complete the incrementals should take no more than a few minutes. If you are backing up only over the local LAN, then feel free to set up an FTP server if that suits you.
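For reference, a minimal sketch of the Samba side on Ubuntu 16.04 (the share name "backups" and the path are just examples; adjust permissions and add user authentication to taste):

```shell
# Install Samba and create a directory to share (example path).
sudo apt-get update
sudo apt-get install -y samba
sudo mkdir -p /srv/backups

# Append a minimal writable share definition to smb.conf
# (consider backing up the original file first).
sudo tee -a /etc/samba/smb.conf >/dev/null <<'EOF'
[backups]
   path = /srv/backups
   read only = no
EOF

# Reload the new configuration.
sudo systemctl restart smbd
```

With that in place, the share appears to Windows as \\server\backups and Duplicati can use it as a local-folder destination.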


I was going to create a new topic titled “Which protocol should be used between Mac and Windows for best performance and reliability: SMB or (S)FTP?”, but this topic seems very related, so I’m just going to ask the question here.

I have some Mac and Windows machines at home. I was able to add a Windows SMB share in the Mac Finder, but I was wondering if I should really install an FTP server on the Windows computer. Are there technical reasons to trust a protocol that was designed for file transfers (accurate transfer commands, listing, integrity checks, etc.)?

Let’s make sure my assumptions here are correct…

  1. you have some Macs and some Windows machines running Duplicati
  2. they are all backing up to a single Windows machine via SMB share
  3. you are wondering if it would be better (faster) to back up to the single Windows machine via (S)FTP

I’m not sure what you mean by that. Did you mount the Windows SMB share somewhere on your Mac so your Mac backup is using the SMB mount point as a destination?

Yep, correct. I have a Windows 7 PC with a shared folder and will use Duplicati installed on a couple of MacBooks to back up directly to the Windows PC. I mounted the share on the Mac using Finder’s Connect to Server command: “smb://win7-pc/backups”.
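As an aside, the same mount can be done from the Mac’s Terminal with `mount_smbfs`, which is handy if you ever want to script it (the username and mount point below are examples):

```shell
# Create a mount point and mount the Windows share over SMB.
mkdir -p ~/mnt/backups
mount_smbfs //user@win7-pc/backups ~/mnt/backups

# ...and to unmount when done:
umount ~/mnt/backups
```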

I have read on the web that for some people FTP is faster, but I’ve also read that SMB has some advantages, such as opening remote files without having to download them first. So I think the answer really depends on the use case; here, of course, specifically on Duplicati.

I don’t recall reading one way or the other on performance, but the only time Duplicati should need to open remote files is during testing (normally it tests one random archive file after each backup) and compacting (which normally happens only when enough files / versions have been deleted that it makes sense to re-compress the remaining contents into fewer archive files).

Note that you don’t necessarily have to mount the SMB share on your Mac to use it as a destination. In Duplicati step 2 (Destination) you can use the “Manually type path” link to directly type in the share path.

I’m not sure exactly what format you’d need on a Mac, but on my Windows machines I would use \\win7-pc\backups.

Since you’re interested in the fastest transfer and Duplicati doesn’t care HOW the destination is accessed, you could run a few backups over SMB and average the times, then change the destination to FTP, run a few more, and average those.
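Before involving Duplicati at all, a rough sketch of the same comparison is to time a raw copy of a generated test file over each transport (the destination path here is a placeholder; substitute your actual SMB mount point, e.g. /Volumes/backups, and later the FTP-mounted path). Duplicati’s own job log also reports each run’s duration.

```shell
# Generate a 50 MB test file to copy.
dd if=/dev/zero of=/tmp/speedtest.bin bs=1M count=50 2>/dev/null

# Placeholder destination: substitute your mounted share here.
mkdir -p /tmp/dest

# Wall-clock time of the transfer; repeat per protocol and compare.
time cp /tmp/speedtest.bin /tmp/dest/speedtest.bin
```

Averaging several runs smooths out caching effects; the first run after mounting is usually the slowest.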

I’d like to point out that the SMB drivers in macOS are terrible. They’re stable, but they’re slower than a herd of snails traveling through peanut butter. With a Windows machine and a Mac both connected over Gigabit Ethernet to a Samba server, the Mac gets between 1/4 and 1/3 of the speed of the Windows machine in just downloading and uploading files.

I still use it for convenience of accessing some files at home, but I’d never back up over it if I was looking for performance.

I’ve googled for a lot of fixes to the speed issue, and there are suggested fixes, but none of them worked for me.

It might make a small difference, but I’m not sure. Also, you could easily work around this by using FUSE to mount SFTP, as FUSE will then take care of things like caching file lists. An example of FUSE for SFTP is SSHFS · osxfuse/osxfuse Wiki · GitHub
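For the curious, a minimal sketch of such a mount once osxfuse and sshfs are installed (the host, user, and paths below are all examples):

```shell
# Create a mount point and mount the server's backup directory over SFTP.
mkdir -p ~/mnt/server
sshfs user@fileserver:/srv/backups ~/mnt/server

# Duplicati can then use ~/mnt/server as a plain local-folder destination.
# To unmount when finished:
umount ~/mnt/server
```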

It should be noted that setting up FUSE is extra work and extra complexity to keep track of, so I would recommend just going with SFTP directly from Duplicati because it’s plenty fast.