2nd beta version of Duplicati 2.0 now available


Hello everyone!

Our second beta version is now available at duplicati.com. A complete list of changes can be found in the Duplicati changelog on GitHub. Here is a summary of the major changes:

We introduced a smart retention policy, which retains backups based on their age. By default, Duplicati keeps one backup for each of the last 7 days, one for each of the last 4 weeks, and one for each of the last 12 months.
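For anyone curious how age-based thinning like this works, here is a rough Python sketch of the bucketing idea. This is only an illustration of the policy described above, not Duplicati's actual code; the function name and the policy table are my own:

```python
from datetime import datetime, timedelta

def smart_retention(backups, now):
    """Sketch of a smart retention policy: keep the newest backup in
    each 1-day bucket for 7 days, each 1-week bucket for 4 weeks, and
    each ~1-month bucket for a year. Illustration only."""
    # (window length, bucket size) pairs, newest window first
    policy = [
        (timedelta(days=7), timedelta(days=1)),
        (timedelta(weeks=4), timedelta(weeks=1)),
        (timedelta(days=365), timedelta(days=30)),
    ]
    keep = set()
    for window, bucket in policy:
        seen_slots = set()
        for b in sorted(backups, reverse=True):  # newest first
            age = now - b
            if age > window:
                continue  # this backup is older than the window
            slot = age // bucket  # which bucket the backup falls into
            if slot not in seen_slots:
                seen_slots.add(slot)  # keep only the newest per bucket
                keep.add(b)
    return keep
```

Backups outside every window are the ones eligible for deletion. If I recall correctly, the equivalent Duplicati setting is expressed as a retention-policy string along the lines of `1W:1D,4W:1W,12M:1M`, but check the documentation for the exact syntax.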

New backends have been added: Duplicati now supports Sia and rclone. That brings Duplicati to well over 20 supported backends, plus five standard protocols, for transferring your backups to almost any storage system.

This version also brings several performance improvements: listing files during restore is faster, post-backup clean-up is faster, and hashing of data blocks is faster.

And there have been many more smaller changes. About 40 “things” have been added, 70 “issues” have been fixed, and much more “stuff” got updated, improved, or changed.

All your data are belong to you!




I see the message for the beta: “Don’t use with important data.” Maybe it’s time to remove this text, or at least move it to the Canary channel?


I had some trouble updating from an earlier 2.x beta. At first I got error messages and no working UI. I tried downgrading to get back to a working install, but then the UI could not connect to the server. Going back to 2.3.3, the app would not start at all, with no error messages. I then wanted to try uninstalling and reinstalling, hoping the backup jobs would be kept, but tried “repair” first, and that made 2.3.3 work. Strangely, the app is listed at a different version in the Win10 app manager, but it is now clearly 2.3.3 and working flawlessly.

(Win10 x64)

Is there a “best practice” when updating between betas?


Nothing running, so I’m guessing I have to delete my existing scheduled backups. Not fun.


Not really - there were a number of update changes / improvements made between the betas. Unfortunately, for SOME users this caused issues like the ones you experienced. It happened to me once on a Windows 10 x64 machine, and I just downloaded the full installer from Duplicati and “over installed” on top of my old version (no uninstall, no backup job loss).

You shouldn’t need to. I believe that message just means that a current task (such as a backup) is running so it can’t activate the new version or the current task would fail.

I’d suggest checking whether a task is running and, if so, letting it finish before trying to activate again. If nothing is running, you may have found an upgrade bug, but you still shouldn’t have to delete any backups. You could try “over installing” from the downloads page (it’s worked for me in the past). If that doesn’t work, you can export your existing jobs to files before a full uninstall / reinstall, then import the files afterwards - you shouldn’t lose anything from your jobs (though maybe some global settings; I’m not sure).


There’s a year-old bug where restores don’t work with non-standard block sizes. Non-standard block sizes are very useful in some scenarios, and an inability to restore seems like a fairly big problem for a backup tool. Are there any plans to fix that bug at some point? I stopped using Duplicati because of it - I couldn’t trust it to restore my data.


A post was split to a new topic: Duplicati beta ate my settings (MacOS)


It took a while, but I’ve finally been able to set the smart retention policy in the new beta. Initially the setting wasn’t saved: it reverted to the previous setting whenever I exited the options dialog and entered it again. Then I found that “Keep all backups” was the only choice being saved. Finally, setting “smart retention” in incognito mode seems to have done the trick, and now I can see the setting in the backup options. I hope it’s actually applied too - let’s wait for the next backup run…


Smart Retention looks great! Yet another item off my Duplicati wishlist…


It looks like it is a parsing/logic issue. If you look at the comments, you can fix the problem by setting the block size manually (and remember to put a B at the end so it is not parsed as MB).

That said, we really should fix small issues like that; unfortunately, my time is limited.


5 posts were split to a new topic: 2 instances Windows service portable mode after update


Are there going to be updates on the experimental channel bringing safe fixes (such as the symbolic link fix) without the riskier changes (such as multi-threading)?

Or is another beta coming soon?

Basically, how can I get the latest fixes without the risks of running canary?



I had not planned on that, but I guess we could cherry-pick some of the minor changes into a new experimental build. Any volunteers for creating a branch with fixes for an experimental release?


Ken, I spent quite a bit of time trying to do a restore, but I couldn’t work out how to do it. I’d really like to start using Duplicati again, but “cannot restore” bugs are a big red flag for me that I shouldn’t trust my data to that backup tool.


I agree. Making a backup that cannot be restored is pointless.

For the year old bug, the solution is to supply --blocksize=1048576b.
Note the trailing b, otherwise it is treated as 1048576 KiB => 1 GiB which is too big a blocksize.
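This footgun is easy to reproduce with a toy parser. Below is a hypothetical sketch of how a bare number can silently default to a much larger unit; this is not Duplicati's actual parsing code, and `UNITS` / `parse_size` are made up for illustration:

```python
# Hypothetical size-string parser: a bare number falls back to a
# default unit, so "1048576" and "1048576b" differ by a factor of 1024.
UNITS = {"b": 1, "kb": 1024, "mb": 1024 ** 2, "gb": 1024 ** 3}

def parse_size(text, default_unit="kb"):
    text = text.strip().lower()
    # try the longest suffixes first so "kb" is not matched as "b"
    for suffix in sorted(UNITS, key=len, reverse=True):
        if text.endswith(suffix):
            return int(text[: -len(suffix)]) * UNITS[suffix]
    return int(text) * UNITS[default_unit]  # bare number: default unit applies

print(parse_size("1048576b"))  # 1048576 bytes = 1 MiB
print(parse_size("1048576"))   # treated as KiB -> 1073741824 bytes = 1 GiB
```

The same value with and without the trailing `b` differs by a factor of 1024, which is exactly the 1 MiB vs. 1 GiB surprise described above.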

That said, I have multiple unit tests that use odd blocksizes, and they work without any problems. So I suspect the issue is that the GUI somehow injects a new blocksize, causing it not to accept the blocksize that was first read (from the manifest inside the zip archives).


I fixed it:

It turned out that it had originally worked, but broke after I introduced some faster listing of data.