Rewrote the lock on database queries to be simpler and less CPU intensive
Removed some logging details in core performance areas (can be re-enabled with --profile-all-database-queries)
Removed automatic attempts to rebuild dblock files as it is slow and rarely finds all the missing pieces (can be enabled with --rebuild-missing-dblock-files)
Fixed the version number on macOS builds
Updated the signing certificate for executables and Windows installers
Added a hostname check for the webserver
Fixed an issue where the number of remaining files would be negative
Updated localization files
Now emits a warning if a size option is missing a suffix (b, kb, mb, gb, tb); see the example after this list
Added partially translated Romanian, Swedish, Thai, Hungarian, Slovak, Catalan, Japanese, Bengali, and Korean languages
Fixed a number of issues with --check-filetime-only
Removed the --store-metadata option
Rewrote the query that fetches the previous information for a file or folder. Set the environment variable TEST_QUERY_VERSION=1 to revert to the old version for speed comparison, or TEST_QUERY_VERSION=2 for an alternate version.
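To illustrate the size-suffix warning mentioned above (a hypothetical example; --dblock-size is just one of the options that takes a size value):

--dblock-size=50
--dblock-size=50mb

The first form now triggers the warning because the unit is ambiguous; the second names the unit explicitly and does not.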
Yes, that is a new feature. If possible, you should use the real hostname instead of *, because a DNS rebinding attack is possible if the hostname is not checked.
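For example (a sketch; the option name --webservice-allowed-hostnames is from memory, so verify it against your version's help output), you could start the server with the real hostname pinned:

duplicati-server --webservice-allowed-hostnames=backup.example.com

Passing * instead accepts any Host header and re-opens the rebinding window, which is why the real hostname is preferred.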
Upgrading from 2.0.3.5 to 2.0.3.10 causes my backups to take a huge amount of time. For instance, a backup of 158,000 files (273 GB) took 1 hour 43 minutes instead of around 2 minutes, with minimal changed files.
@kenkendk, what is a “check” command and how would I run one?
--disable-filelist-consistency-checks
In backups with a large number of filesets, the verification can take up a large part of the backup time. If you disable the checks, make sure you run regular check commands to ensure that everything is working as expected.
Default value: “false”
It toggles the pre-backup verification. Before running a backup, Duplicati performs a consistency check on the database to verify that there are no missing or extra items that could indicate database issues. Some users reported that this check was slower than the backup itself, so the option is intended for those who see that issue and prefer a faster backup over the extra validation.
It should probably be “run regular verify commands”.
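As a sketch (duplicati-cli, the storage URL, and the source path are placeholders, not your exported command), the option is simply appended to the backup invocation:

duplicati-cli backup <storage-url> <source-folder> --disable-filelist-consistency-checks=true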
Ok, then the likely culprit is the new file query. It is supposed to be faster now, but maybe I messed up.
Can you look at the results from the last slow backup and check the “OpenedFiles” count? If it is high (close to “ExaminedFiles”), I have messed something up.
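For illustration only (the numbers are made up), the statistics in the backup result would look something like this in the bad case:

ExaminedFiles: 158000
OpenedFiles: 157200

OpenedFiles close to ExaminedFiles means nearly every file was re-read even though few changed; a healthy run with few changes should show OpenedFiles at or near zero.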
For a more thorough check, you can extract the commandline from the “Export …” → “As commandline”, and then run it from a terminal.
In the terminal you can try this:
export TEST_QUERY_VERSION=1
And then run the backup command again to see if the speed improves.
The supported values are: 1 = the old version, 2 = the experimental version, and anything else = the new version.
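Putting the steps together, a minimal sketch (assuming a POSIX shell and the duplicati-cli launcher; substitute the command and options you exported from the GUI):

export TEST_QUERY_VERSION=1
duplicati-cli backup <storage-url> <source-folder> [exported options]
export TEST_QUERY_VERSION=2
duplicati-cli backup <storage-url> <source-folder> [exported options]
unset TEST_QUERY_VERSION

On Windows, use set TEST_QUERY_VERSION=1 in the same console instead of export.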
Could you try the above as well? There were many changes since 2.0.3.5, but maybe you hit the same issue?
I ran another backup manually with query = 1; it took 26 minutes.
I ran another with query = 3; it took just 5 minutes.
Running again from the web GUI now; this seems to take longer.
It has been backing up the Duplicati SQLite database for a long time while showing 100% progress:
/Users/manderss99/.config/Duplicati/PWLGLZTCDI.sqlite
I have to run, but I will post back later with how long it took.
Source: 3,21 GB
Backup: 3,91 GB / 14 Versions
Current action: Backup_ProcessingFiles
Progress: 100.00%
Current file: /Users/manderss99/.config/Duplicati/PWLGLZTCDI.sqlite
Using TEST_QUERY_VERSION=1 doesn’t result in any change. One of my backup sets has a low number of opened files (0 or close to it), but another has a high number.