Release: (beta) 2018-11-28

The major changes in this version are:

  • New multi-threaded processing engine
  • New logging system with filter options
  • Better external reporting, with JSON support
  • Filter groups to exclude common unwanted files
  • Ignore filenames and empty folders added
  • USN support on Windows
  • Improved repair and validation
  • Fixed reading password from console
  • UID and GID now correctly restored on Linux/BSD/MacOS
  • Added a number of new languages to the user interface

And of course many, many, other updates and fixes contributed by the Duplicati community.

A big thanks to all who contribute fixes, report issues, and participate on the forum!


Congratulations on releasing a long-awaited beta :slight_smile:
With OneDrive Microsoft Graph API now well supported on the latest beta (including two factor auth), I will be able to finally jump off the canary channel.


I think it’s likely that users updating from the beta who also use Chrome will encounter the issue described in “Chrome: Empty status bar starting with”. The posted solution to that (which also worked for me) was to clear Chrome’s browser cache.

This user said the update wouldn’t apply at all - they had to install over the existing version.

Nice work! Out of curiosity, is this beta equivalent to the experimental?

I was going to check on github but for some reason I can’t figure out how to compare source between two different tags/releases. Need to research that a bit more…

It looks like the code is basically equivalent, but some build issues were fixed that may or may not affect you: …v2.0.4.5-

and the same could be said for the experimental versus its stated base: …v2.0.4.2-

Release: (experimental) 2018-11-08 might have gone beta, but a Docker build bug needed fixing.


Thank you! I recall seeing a “compare” button in github before but couldn’t find it today. Those links worked great.
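For future reference, GitHub’s compare view can also be reached directly by URL, without hunting for the button. A minimal sketch of the URL shape (BASE_TAG and HEAD_TAG are placeholders, not the real release tags):

```shell
# GitHub's three-dot compare view shows commits reachable from HEAD_TAG
# but not from BASE_TAG. The tag names here are hypothetical placeholders;
# substitute the actual release tags you want to compare.
repo="duplicati/duplicati"
base_tag="BASE_TAG"
head_tag="HEAD_TAG"
compare_url="https://github.com/${repo}/compare/${base_tag}...${head_tag}"
echo "$compare_url"
```

The same comparison also works locally in a clone of the repository with `git log BASE_TAG..HEAD_TAG --oneline`.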

So what should I pay attention to when updating from the previous beta?
That version runs stably, but it has some problems setting the symlink parameters (so an upgrade would be nice, to get rid of those errors).

Did anything change in the send mail behavior? (as I use dupReport)

Is it best to update via the web interface, or just do a reinstall over existing?

And finally, I might have missed this, but is file change tracking now used on NTFS systems, or is a full scan still performed at every backup?

That’s a valid question - these #releases posts generally only detail changes since the previous release, regardless of your update path, so you don’t see a full list of everything that changed since the last beta (or experimental).

This page should provide a summary of changes for multiple releases. Just read from your current version up to the one you want to install to know all that changed. :slight_smile:

That being said, as far as I recall, no changes happened in send mail.

Yep! USN (Update Sequence Number Journal) is the NTFS Change Journal that can now be tapped for what’s changed instead of needing a file scan.

Yep! USN (Update Sequence Number Journal) is the NTFS Change Journal that can now be tapped for what’s changed instead of needing a file scan.

… under the condition that Duplicati is running with Administrative privileges. So in practice, this probably means you need to install and run it as a service.
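If I remember the option name correctly, the relevant advanced option is --usn-policy - that name and its values are an assumption from memory, so verify against the option list in your build before relying on it. Something along these lines, from an elevated prompt on Windows:

```shell
# Sketch only: --usn-policy and its values are assumed from memory; verify
# with the built-in help before use. "auto" is expected to fall back to a
# full file scan when the change journal cannot be read (e.g. no admin rights).
Duplicati.CommandLine.exe backup <target-url> C:\data --usn-policy=auto
```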

My thinking was that most people using the beta would be confused by a huge list of changes, so I manually create a curated list of “highlights” for the experimental and beta releases.

The full list is also in the changelog: duplicati/changelog.txt at master · duplicati/duplicati · GitHub


That makes sense to me - and thanks for the link to the txt file. Maybe we should try to include that link in non-canary release posts.

A “for a complete list of changes see…” link would make it obvious that the release post is just a summary, and would make it easy for those who want more detail to find it.


I think there could be an issue with the latest release - or it’s me. I went to do a “system” backup (I have separate jobs for logs, DB, system, and email on a mail server - CentOS 7) and noticed that it was backing up the sqlite files under /root/.config/Duplicati/. Backing up these files appeared to error out:

Log data:
2018-12-03 13:38:26 -06 - [Error-Duplicati.Library.Main.Database.LocalBackupDatabase-CheckingErrorsForIssue1400]: Checking errors, related to #1400. Unexpected result count: 0, expected 1, hash: elCQ7j+8fWDBZo+E9d6Vml58NRsFml+GZ+fNu4rQt74=, size: 102400, blocksetid: 319219, ix: 18, fullhash: YkKv7TqwdBuLs9dadt6ccZcTymUY3xZjncnybGIl4pc=, fullsize: 1968224
2018-12-03 13:38:26 -06 - [Error-Duplicati.Library.Main.Database.LocalBackupDatabase-FoundIssue1400Error]: Found block with ID 12318357 and hash elCQ7j+8fWDBZo+E9d6Vml58NRsFml+GZ+fNu4rQt74= and size 37304
2018-12-03 13:38:26 -06 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: /root/.config/Duplicati/VQAOCWDRRT.sqlite-journal
System.Exception: Unexpected result count: 0, expected 1, check log for more messages
  at Duplicati.Library.Main.Database.LocalBackupDatabase.AddBlockset (System.String filehash, System.Int64 size, System.Int32 blocksize, System.Collections.Generic.IEnumerable`1[T] hashes, System.Collections.Generic.IEnumerable`1[T] blocklistHashes, System.Int64& blocksetid, System.Data.IDbTransaction transaction) [0x002fb] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 
  at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass13_0.<AddBlocksetAsync>b__0 () [0x00000] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 
  at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass4_0`1[T].<RunOnMain>b__0 () [0x00000] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 
  at Duplicati.Library.Main.Operation.Common.SingleRunner.DoRunOnMain[T] (System.Func`1[TResult] method) [0x000b0] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 
  at Duplicati.Library.Main.Operation.Backup.StreamBlockSplitter+<>c__DisplayClass2_0.<Run>b__0 (<>f__AnonymousType13`3[<Input>j__TPar,<ProgressChannel>j__TPar,<BlockOutput>j__TPar] self) [0x00a24] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 
  at Duplicati.Library.Main.Operation.Backup.StreamBlock.ProcessStream (CoCoL.IWriteChannel`1[T] channel, System.String path, System.IO.Stream stream, System.Boolean isMetadata, Duplicati.Library.Interface.CompressionHint hint) [0x0012c] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 
  at Duplicati.Library.Main.Operation.Backup.FileBlockProcessor+<>c__DisplayClass1_0.<Run>b__0 (<>f__AnonymousType8`2[<Input>j__TPar,<StreamBlockChannel>j__TPar] self) [0x002b3] in <c6c6871f516b48f59d88f9d731c3ea4d>:0

Perhaps related: I have/had a “filter” which should exclude “/root/.pcloud/”, but I found that today’s backup appeared to include files from that directory, and I don’t believe it did under the previous version. I’m not 100% sure, but in case the trailing “/” was causing an issue, I’ve changed the filter to “/root/.pcloud”; I have not yet run another backup to see whether that makes any difference. If it doesn’t, it may be that exclusion filters aren’t working properly in this release - that would also explain why it might be attempting to back up the Duplicati sqlite databases.
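As a generic illustration of why a trailing slash can matter (this is plain shell glob matching, not Duplicati’s actual filter engine), a pattern ending in “/” can fail to match because the path string being tested doesn’t end in a slash:

```shell
# Not Duplicati's filter engine - just shell case/glob matching to show the idea.
path="/root/.pcloud"          # a folder path as typically presented: no trailing slash
match1="no"; match2="no"
case "$path" in "/root/.pcloud/"*) match1="yes";; esac  # requires something after the slash
case "$path" in "/root/.pcloud"*)  match2="yes";; esac  # also matches the bare folder
echo "$match1 $match2"        # prints: no yes
```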

Now, to complicate things slightly: I also upgraded mono today (so the “system” backup will be larger), but this is the first time I’ve ever seen errors regarding the backup of the Duplicati databases…

Can you check your restore options and see whether the Duplicati databases appear in backups from before the update?

(Note that I edited your post by adding ~~~ before & after your error text to make it easier to read.)

Thanks for the cleanup. Yes, I do in fact see both /root/.config/Duplicati/*.sqlite and /root/.pcloud in my backups from a month ago, so the filters weren’t working before the update either. When viewed as text, the filters are configured like this:


Perhaps this is my bad, but I thought “exclude folder” would exclude the folder’s contents, including subdirectories. (I do not see Hashbackup in the list of restorable files, so that filter works - though that path is a symlink to another partition, which is excluded.)

Are you able to post your job exported “as command-line” (or at least the filters including the parameter name)?

--backup-name=System_B2 --dbpath=/root/.config/Duplicati/VQAOCWDRRT.sqlite --encryption-module=aes --compression-module=zip --dblock-size=100MB 
--retention-policy="1W:1D,4W:1W,12M:1M" --threshold=65 --small-file-max-count=3000 --disable-module=console-password-input --exclude=/root/.pcloud/ --exclude=/root/.config/Hashbackup/ --exclude="*/pcloud/*/"

Oh, I also ran another backup of the System - it failed again on the sqlite files, but this time the backup took 17 minutes when it normally takes around an hour.

Well everything looks OK to me - and you are correct that EXCLUDE should ignore the folder and all subfolders.

Is the folder actually called .pcloud or pcloud (no dot-prefix)?

under /root/ it’s .pcloud:

[root@voyageur centos]# du -sh /root/.pcloud
5.2G    /root/.pcloud

I have the third exclude there to exclude the mount point, /pcloud, and its sources.

Another potential issue/difference I’m seeing: the space used by /root/.config/Duplicati/ has doubled - I now have twice as many sqlite files:

[root@voyageur Duplicati]# ls -l *sqlite
-rw-r--r-- 1 root root 7761055744 Dec  3 18:02 ATQJGACUOV.sqlite
-rw-r--r-- 1 root root 7761055744 Dec  3 11:00 backup 20181203060001.sqlite
-rw-r--r-- 1 root root      83968 Dec  3 11:36 backup 20181203113621.sqlite
-rw-r--r-- 1 root root   10265600 Dec  3 02:45 backup 20181203123747.sqlite
-rw-r--r-- 1 root root   22885376 Dec  3 00:20 backup 20181203123805.sqlite
-rw-r--r-- 1 root root  787825664 Dec  3 02:06 backup 20181203124427.sqlite
-rw-r--r-- 1 root root   10265600 Dec  3 12:38 CZFNGQMCYW.sqlite
-rw-r--r-- 1 root root      83968 Dec  3 19:10 Duplicati-server.sqlite
-rw-r--r-- 1 root root  806103040 Dec  3 19:10 VQAOCWDRRT.sqlite
-rw-r--r-- 1 root root   22885376 Dec  3 12:38 ZIKDBEFESL.sqlite

Can I delete the “backup YYYYMMDDHHMMSS.sqlite” files if I don’t intend to go back to the previous version?
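For what it’s worth, those “backup YYYYMMDDHHMMSS.sqlite” files look like pre-upgrade copies of the job databases, so a quoted glob should only touch the timestamped copies and leave the active databases alone. A sketch against a scratch directory first (adapt the same pattern to /root/.config/Duplicati once you’re confident, and keep a copy elsewhere if unsure):

```shell
# Simulate in a scratch dir before touching the real config directory.
dir=$(mktemp -d)
touch "$dir/ATQJGACUOV.sqlite" "$dir/Duplicati-server.sqlite" \
      "$dir/backup 20181203060001.sqlite" "$dir/backup 20181203113621.sqlite"
rm "$dir"/backup\ *.sqlite   # escaped space: only the timestamped copies match
ls "$dir"                    # the active *.sqlite databases remain
```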