Newbie questions: Windows

Hi all,

I’m new to Duplicati and to backups in general. After a ransomware infection last summer caused a real disaster (over 2 TB of home photos and videos lost), I began to look for a solution… and finally found Duplicati a few days ago. I started testing it: it works flawlessly!

So the first thing is a very big thanks to the developers who give their time to help users like me!

I’m using SharePoint libraries to store my data. For reasons I don’t understand, I can’t connect to my account with Duplicati’s SharePoint v2 backend, but it works fine with the basic one.

I find that initial backups take a very long time: the speed is only about 5 MB/s, and my internet bandwidth allows much more. The speed doesn’t seem to be limited by the processor (it’s the same with a 10-year-old Intel quad core and with a fairly new i7). The SharePoint host seems to accept much faster connections, as I get the same speed on each computer when backing up two different machines simultaneously.

First question: is there a way to optimise upload speed?

Second, I saved my backup jobs to files. What if I use them on another computer? Or on the same computer after reinstalling the OS following a crash? I guess I would have to change the drive letters of the source folders when moving to a different computer… Is that the only change that needs to be made?

Thanks again for this great software and for taking the time to answer such newbie questions.


PS: I’m using Duplicati

Hi @Jean-Christophe_Kehr, welcome to the forum!

First answer:
Yes, there usually are ways to optimize speed, but they vary depending on the system setup and what part you are trying to optimize. Note that the initial backup is almost always “slow” and subsequent ones should run much faster.

That being said, what are you trying to optimize? Total speed? Least amount of destination space used? Least impact on local system performance (so other tasks aren’t slowed down)? Using all of your internet bandwidth?
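If raw upload throughput (or capping it at half the link) is the goal, a few of Duplicati’s advanced options are worth a look. A sketch, assuming a recent Duplicati 2 build — exact option names and defaults can differ between versions, so verify against the Advanced Options list in your own GUI:

```
# Cap uploads at roughly half of a 50 MB/s link
--throttle-upload=25MB

# Upload several remote volumes in parallel instead of one at a time
--asynchronous-concurrent-upload-limit=4

# Larger remote volumes mean fewer, bigger uploads (default is 50MB)
--dblock-size=100MB
```

These can be set per job under “Advanced options” in the job editor, or passed on the command line.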

Second answer:
You should use a single backup DESTINATION FOLDER on one computer at a time. You can copy (export / import) your backup jobs to other machines, but be sure to change the destination folder so it’s not the same as another one of your backups.

If you don’t, you’ll get warnings about extra or missing destination files when you try to run your backup and then be prompted to run a database Repair. If you’ve got two different jobs pointing to a single destination folder and you run a database Repair, you’ll most likely break one of the backups while attempting to fix the other.

Other than that (destination folder) nothing else NEEDS to be changed.
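To make the destination change concrete: an exported job is a JSON file, and in a typical Duplicati 2 export the destination and sources look something like the fragment below. The field names and the URL shown here are illustrative — compare with your own exported file before editing. The destination URL is what must differ between machines:

```
{
  "Backup": {
    "Name": "Photos",
    "TargetURL": "https://example.sharepoint.com/.../backup-pc1",
    "Sources": [ "D:\\Photos\\" ]
  }
}
```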

Thanks for taking the time to read and answer!

I would like to optimize upload speed. It’s currently around 5 MB/s, and my internet connection (tested) allows more than 50 MB/s. I would like backups to use half of that bandwidth, but for reasons I can’t identify, I haven’t managed to do so. Of course the first backup takes much longer, but subsequent ones are shorter only because less data is uploaded; the upload speed itself is the same.

I use Duplicati in a small test environment: a server running Windows Server acting as DC, DNS, DHCP, WDS, VPN and file server (about 5 TB of data), and 4 workstations (Windows 10 x64) on the same network.

I need to back up about 4 TB of data and encrypt it before storing it in SharePoint libraries. It’s easy to calculate the time needed to upload 4 TB at 5 MB/s! So optimizing upload speed is a real need for me. I’m sure I’m not limited by processor power: I get the same upload speed with a 10-year-old quad core and with a recent i7.
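For the record, that “easy calculation” works out like this (binary units assumed, 1 TB = 1024 × 1024 MB):

```python
# Rough estimate of initial backup time: 4 TB uploaded at 5 MB/s.
TB_IN_MB = 1024 * 1024          # 1 TB = 1,048,576 MB (binary units)

data_mb = 4 * TB_IN_MB          # 4 TB expressed in MB
speed_mb_s = 5                  # observed upload speed in MB/s

seconds = data_mb / speed_mb_s
days = seconds / 86400          # 86,400 seconds per day

print(f"{days:.1f} days")       # prints "9.7 days"
```

So at 5 MB/s the initial backup would run for well over a week, versus about a day at 25 MB/s.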

I divided the whole backup into smaller jobs. I still get 5 MB/s on each workstation when running them simultaneously against different remote files, which suggests a per-computer limitation, so perhaps a Duplicati setting. Since the source files range from 700 MB to 10 GB, I raised the size of the uploaded volumes from 50 to 500 MB, which seemed more appropriate; however, it didn’t change the upload speed.

Concerning my second question, my first explanation wasn’t clear. Since I get the same speed when backing up remote files from different machines simultaneously, I thought I could create jobs on several workstations on my network to get the initial backup done faster, and then, once the initial backup is complete, import those jobs on the server to run the subsequent backups, which are less frequent and upload far less data.
Is the file export/import feature usable to achieve this?

Thanks again for taking the time, and sorry for my English; it isn’t my native language.