Release: 2.0.4.29 (canary) 2019-09-17

2.0.4.29-2.0.4.29_canary_2019-09-17

  • Added workaround for bug in Mono 6.0, thanks @warwickmm
  • Extra logging during database recreates, thanks @drwtsn32x
  • Added options to MSI installer for parameterized installs, thanks @BlueBlock
  • Improved debug building, thanks @BlueBlock
  • Updated packages: FluentFTP, CoCoL, Microsoft.Azure, AWSSDK, MailKit, MimeKit, thanks @BlueBlock
  • Improved progress bar status, thanks @drwtsn32x
  • Fixed a parsing issue when reading the server path in the UI, thanks @FlyingFox333
  • Code quality improvements, thanks @warwickmm
  • Added code to actually remove purged volumes, thanks @BlueBlock
  • Updated bundled GPG and checking for user-installed GPG on Windows, thanks @BlueBlock
  • Improved handling of the “Stop after current file” method, thanks @BlueBlock
  • Updated list of S3 locations and storage classes, thanks @kenkendk

Edit: The macOS PKG and DMG files are notarized for Gatekeeper. Please report any issues with this.


Looking good… upgraded 4 of my machines to this release, no issues.


This version contains a database upgrade. Rolling back to a previous canary or beta version is not recommended.

Looks good to me, and the new MSI parameter works a treat, so updating my machines should be a little faster. Thank you @BlueBlock

Do you mean for upgrades from the Beta release?

Edit: this is the change: https://github.com/duplicati/duplicati/blob/master/Duplicati/Library/Main/Database/Database%20schema/10.%20Add%20IsFullBackup%20to%20Fileset%20table.sql
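Judging by the file name, migration 10 adds an IsFullBackup column to the Fileset table and bumps the schema version from 9 to 10. A minimal sketch of the idea against a throwaway database (the exact table layout, column default, and statements are assumptions; the linked file is the real script):

```shell
# Sketch only: reproduce the gist of migration 10 in a scratch database.
# Table layout and default value are assumptions, not the real schema.
db=$(mktemp)
sqlite3 "$db" <<'SQL'
CREATE TABLE Version (ID INTEGER PRIMARY KEY, Version INTEGER);
INSERT INTO Version (Version) VALUES (9);
CREATE TABLE Fileset (ID INTEGER PRIMARY KEY, Timestamp INTEGER);

-- The gist of "10. Add IsFullBackup to Fileset table.sql":
ALTER TABLE Fileset ADD COLUMN "IsFullBackup" INTEGER NOT NULL DEFAULT 1;
UPDATE Version SET Version = 10;
SQL
sqlite3 "$db" 'SELECT * FROM Version;'   # prints 1|10, matching an upgraded job database
rm -f "$db"
```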

There appears to be… I upgraded from 28 to 29 and Duplicati created backups of my job-specific databases.

I did the same, but no backups were created on any of my machines, both Windows and Linux.

Strange… I’m showing a version bump from 9 to 10:

root@stimpy [/root/.config/Duplicati]# sqlite3 LYKQSUOLIN.sqlite 'select * from Version;' 
1|10
root@stimpy [/root/.config/Duplicati]# sqlite3 backup\ LYKQSUOLIN\ 20190917053203.sqlite 'select * from Version;'                 
1|9

Yes, very strange, as mine on Windows and Linux all still show version 9.

And you did run a backup, correct? I believe job-specific databases are not upgraded until a backup job is run.

Ah, that could be it. I upgraded after I knew most backups had run, so I will check that out.

I confirmed with a backup that ran 15 minutes ago that it is indeed updated to v10.

Thanks for helping to clear that up.

It will upgrade the db when a backup is run.
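So immediately after installing, a mix of version 9 and version 10 job databases is expected. A quick way to see which ones have been upgraded, extending the sqlite3 check shown earlier in the thread (the path is the default Linux config directory, an assumption; adjust for your install):

```shell
# Print the schema version of every job database in the config
# directory. Version 10 = upgraded; version 9 = no backup has run
# under 2.0.4.29 yet. Path is an assumption (default Linux location).
for db in ~/.config/Duplicati/*.sqlite; do
    printf '%s: ' "$db"
    sqlite3 "$db" 'SELECT Version FROM Version;' 2>/dev/null || echo '(no Version table)'
done
```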

This is a total hunch, but was something changed in the de-duplication / cleanup code? I updated six systems today with this version, and observed three of them uploading a much larger amount than normal; three systems also ran a compact immediately afterward.

Sure, it could be a coincidence, especially the compaction. But the large upload volume gave the impression that maybe something that should have been uploaded earlier wasn't, or that something was re-uploaded for some reason. The systems in this case are completely separate from each other, so it wasn't a shared data-set change affecting multiple systems. To repeat: this is a total hunch, just a strange feeling, and I'm wondering if anyone noticed something similar. Just getting paranoid, no worries. If there were a serious issue, the automated restore testing would of course have caught it.

Check the full log of the backup job to see how much was uploaded, and if it was a result of a compact operation.

I didn’t notice anything unusual myself.

Yes. Deleted remote volumes stick around in the database for a while to fix issues where the backend reports the files as existing even though they have been deleted. Before this update, those files would continue to stay in the database, but they will now be purged from the database.

I do not see a case where it should have the effect that you report, but maybe @BlueBlock has a better idea of how the fix works?
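The lingering-volume state described above can be inspected directly. A rough query, with the caveat that the table and column names (Remotevolume, State) are assumptions about Duplicati's job-database schema, and that you should run it against a copy rather than the live database:

```shell
# Count remote volumes per state in a job database; rows in a
# "Deleted" state are the ones the new code should purge. The
# database name is the example from earlier in this thread, and the
# Remotevolume/State names are assumptions about the schema.
sqlite3 ~/.config/Duplicati/LYKQSUOLIN.sqlite \
    'SELECT State, COUNT(*) FROM Remotevolume GROUP BY State;'
```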


@drwtsn32 @Taomyn Sorry for the confusion, I somehow overlooked the database upgrade. But yes, this is the update:


In Windows 10 1903 I run Duplicati as a service. I upgraded Duplicati to version 2.0.4.29 yesterday.
After the first backup completed in the new version, I found that Duplicati.Server.exe consumes about 45% of the CPU. I have a 4-thread processor.
I don't know what the process is doing when no backup is running.

Here is the status from Process Explorer:

There are multiple reports of the CPU issue from multiple people on multiple OSes. I hope someone with the right tools (perhaps a profiler?) can narrow down what's going on. Meanwhile, I suggest care, because the database upgrade in this release will make it difficult to downgrade to 2.0.4.28: the backup DB file gradually goes stale as time passes. Its name these days is like backup.random-letters.date.sqlite.
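If a downgrade is on the table, it helps to know how stale that pre-upgrade copy is. A quick check, assuming the default Linux config directory and the "backup <name> <timestamp>.sqlite" naming shown earlier in this thread:

```shell
# List the pre-upgrade database copies Duplicati left behind. The
# timestamp embedded in each name tells you how stale the copy is:
# any backups run since then are not in it. Path is an assumption
# (default Linux location).
ls -l ~/.config/Duplicati/backup*.sqlite
```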