Release: 2.0.3.10 (canary) 2018-08-30

  • Updated the help text shown when no certificates are found, thanks @jonmikeiv
  • Fixed logging details, thanks @mnaiman
  • Fixed error messages on repair, thanks @mnaiman
  • Refactored the FTP backend, thanks @verhoek
  • Rewrote the lock on database queries to be simpler and less CPU intensive
  • Removed some logging details in core performance areas (can be re-enabled with --profile-all-database-queries)
  • Removed automatic attempts to rebuild dblock files as it is slow and rarely finds all the missing pieces (can be enabled with --rebuild-missing-dblock-files).
  • Fixed the version number on MacOS builds
  • Updated the signing certificate for executables and Windows installers
  • Added a hostname check for the webserver
  • Fixed an issue where the number of remaining files would be negative
  • Updated localization files
  • Now emits a warning if a size option value is missing a unit suffix (b, kb, mb, gb, tb)
  • Added partial translations for Romanian, Swedish, Thai, Hungarian, Slovak, Catalan, Japanese, Bengali, and Korean to the languages
  • Fixed a number of issues with --check-filetime-only
  • Removed the --store-metadata option
  • Rewrote the query that fetches the previous information for a file or folder. Set the environment variable TEST_QUERY_VERSION=1 to revert to the old version for speed comparison, or TEST_QUERY_VERSION=2 for an alternate version.
  • Improved UI status messages, thanks @lucascosti
  • Failing to add a file will now give a warning instead of stopping the backup
  • Removed a hot-item cache for VSS
  • Added option --disable-filelist-consistency-checks to allow speeding up large backups
  • Now ignoring ENODATA error message when reading metadata on Linux/BSD
  • Added additional support for exit codes in --run-script-before to allow stopping the backup or emitting a warning (see the example script after this list)
  • Fixed an issue with Google Cloud Storage, thanks @warwickmm
  • Improved the B2 username field description, thanks @xfakt-pj
  • Removed some unused code, thanks @warwickmm
  • Improved source code documentation, thanks @mikaelmello
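
The extended --run-script-before exit codes mentioned above can be illustrated with a small shell script. This is only a sketch: the exit-code mapping in the comments is reproduced from the run-script-example.sh file shipped with Duplicati (verify it against the copy in your install), and the /mnt/backup path is a made-up placeholder:

#!/bin/sh
# Hypothetical --run-script-before script; /mnt/backup is a placeholder.
# Exit codes, per the bundled run-script-example.sh:
#   0 = OK, run backup          1 = OK, skip backup
#   2 = warning, run backup     3 = warning, skip backup
#   4 = error, run backup       5 = error, skip backup
if [ ! -d /mnt/backup ]; then
    echo "Backup destination not mounted, skipping this run"
    exit 3   # emit a warning and skip the backup
fi
exit 0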

Hi,
after updating to 2.0.3.10 I get the following message when trying to access the web interface:

The host header sent by the client is not allowed

Tried Firefox and Chrome, running Ubuntu 18.04.

Edit:
after accessing the interface through 127.0.0.1 and adding * to the hostnames (new feature?), I can access the interface remotely again.

Yes, that is a new feature. If possible, you should use the real hostname instead of *, as a DNS rebinding attack is possible if the hostname is not checked.
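
If I remember the option name correctly, the list of accepted names can be set with --webservice-allowed-hostnames when starting the server. A minimal sketch, assuming the Linux duplicati-server wrapper and a placeholder hostname:

duplicati-server --webservice-allowed-hostnames=nas.example.com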

What would this do if enabled?

Does it mean that real-time file listing during the backup is enabled or disabled? Something like that?

Upgrading from 2.0.3.5 to 2.0.3.10 causes my backups to take a huge amount of time. For instance, a backup of 158,000 files (273 GB) took 1 hour 43 minutes instead of around 2 minutes with minimal changed files.

Does that extended time persist across multiple runs or just the first one after the upgrade?

@kenkendk, what is a “check” command and how would I run one?

--disable-filelist-consistency-checks
In backups with a large number of filesets, the verification can take up a large part of the backup time. If you disable the checks, make sure you run regular check commands to ensure that everything is working as expected.
Default value: “false”

After upgrading to 2.0.3.10 yesterday, tonight’s two backups, which usually take about 10 minutes, took 2 hours each.

Running on macOS High Sierra.

It toggles the pre-backup verification. Prior to running a backup, Duplicati does a consistency check on the database to verify that there are no missing or extra entries that could indicate database issues. Some users reported that this check was slower than the backup itself, so the option is intended for those who see that issue and prefer a faster backup over the extra validation.
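
For those affected it should just be a matter of adding the option to the job, or to the exported command line. A sketch, assuming the duplicati-cli wrapper; the storage URL and source path are placeholders:

duplicati-cli backup "file:///mnt/backup" /home/user/data --disable-filelist-consistency-checks=true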

It should probably be “run regular verify commands”.
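
Such a verification can be run from a terminal with the test command (verify should be an alias for it, if I remember right). A sketch, assuming the duplicati-cli wrapper; the storage URL is a placeholder, and you need the same passphrase and --dbpath options as the backup itself:

duplicati-cli test "file:///mnt/backup" 5

The trailing number is how many sets of remote volumes to sample and check against their recorded hashes.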

Which version are you upgrading from?

from version 2.0.3.9.

It takes that long every time, not just the first time. Recreating the database doesn’t help either.

Ok, then the likely culprit is the new file query. It is supposed to be faster now, but maybe I messed up.

Can you look at the results from the last slow backup and check the “OpenedFiles” count? If this is high (close to “ExaminedFiles”), I have messed something up.

For a more thorough check, you can extract the command line via “Export …” → “As commandline” and then run it from a terminal.

In the terminal you can try this:

export TEST_QUERY_VERSION=1

And then run the backup command again to see if the speed improves.
The values supported are 1 = old version, 2 = experiment, anything else = new version.
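
Put together, a terminal session could look like this. Only the export line is literal; the backup command below it is a placeholder for whatever “Export …” → “As commandline” gives you:

export TEST_QUERY_VERSION=1
duplicati-cli backup "file:///mnt/backup" /home/user/data --dbpath=/home/user/.config/Duplicati/XXXXXXXXXX.sqlite

Note that the variable only affects processes started from that shell, so the server/tray icon instance keeps using the new query unless it is started with the variable set.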

Could you try the above as well? There were many changes from 2.0.3.5, but maybe you hit the same issue?

ExaminedFiles: 38726 OpenedFiles: 446

Messages: [
  2018-09-01 05:00:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
  2018-09-01 05:30:35 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: (),
  2018-09-01 05:30:45 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (175 bytes),
  2018-09-01 06:57:48 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b83d62cd31219400a85b825e6089fc670.dblock.zip.aes (49,93 MB),
  2018-09-01 06:58:01 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b83d62cd31219400a85b825e6089fc670.dblock.zip.aes (49,93 MB),
  ...
]
Warnings: []
Errors: []

Ran another backup manually with query version 1; it took 26 minutes.
Ran another with query version 3 (anything other than 1 or 2, i.e. the new query); it took just 5 minutes.

Running again from the web GUI now; this seems to take longer.
It has been backing up the Duplicati SQLite database for a long time while the progress says 100%:
/Users/manderss99/.config/Duplicati/PWLGLZTCDI.sqlite

I have to run but will post back later about how long it took.

Source: 3,21 GB
Backup: 3,91 GB / 14 Versions
Current action: Backup_ProcessingFiles
Progress: 100.00%
Current file: /Users/manderss99/.config/Duplicati/PWLGLZTCDI.sqlite

Using TEST_QUERY_VERSION=1 doesn’t result in any change. One of my backup sets has a low number of opened files (0 or close to it), but another has a high number:

ExaminedFiles: 139355 OpenedFiles: 108050 

All of them are slow.

From the web GUI it took 1.25 hours.

Same problem with slow backups (used to take minutes, now takes hours) on Windows 10, running as a service.

Is there a way to specify the TEST_QUERY_VERSION in the web interface?

I got the same problem with a hugely increased backup time. It seems that the new query is the culprit. With

TEST_QUERY_VERSION=1

speed is back to normal. Running on Windows 10. Tested on 2 PCs.

Probably not, other than using a --run-script-before parameter to set it in a batch file.

You could also set it for the account Duplicati runs under via the standard Windows 10 environment variable settings; the service needs to be restarted to pick up the change.