Plans for next release?

I’m curious whether there are any plans or a timeline for a release.

I identified some performance issues in the database and found some indexes to fix them.

I noticed that those query performance issues were not uncommon. They were an edge case in the sense that they required a large dataset with many files, but I feel like that is a pretty common use case.
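To make the point concrete, here’s a minimal sketch of the kind of fix being described: adding an index so a lookup stops scanning the whole table. The `File` table and columns below are illustrative stand-ins, not Duplicati’s actual schema, and the real indexes live in the pull request under discussion; this just shows the mechanism in SQLite, which Duplicati uses for its local database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE File (ID INTEGER PRIMARY KEY, Path TEXT, BlocksetID INTEGER)")
conn.executemany(
    "INSERT INTO File (Path, BlocksetID) VALUES (?, ?)",
    [(f"/data/file{i}", i % 1000) for i in range(10000)],
)

# Without an index, a lookup by BlocksetID scans every row -- cheap on a
# small table, painful once the table holds millions of file entries.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT Path FROM File WHERE BlocksetID = ?", (42,)
).fetchone()[-1]
print(plan_before)  # reports a full-table SCAN

# With an index, the same lookup becomes a b-tree search.
conn.execute("CREATE INDEX IX_File_BlocksetID ON File (BlocksetID)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT Path FROM File WHERE BlocksetID = ?", (42,)
).fetchone()[-1]
print(plan_after)  # reports a SEARCH ... USING INDEX
```

The difference only becomes dramatic at scale, which is exactly why it reads like an edge case until someone backs up a large, many-file dataset.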

After initially testing it, I was going to try a different backup solution because of the issues I was experiencing. That’s why I feel these indexes should probably be applied as a minor change to one of the release branches, so new users don’t run into the same issues I did, which could drive them away.

I am willing to do the dataset testing for any extra validation we feel is needed to further verify the indexes. I also don’t mind making the new release branch with git.

Let me know what you think!

I know I will be using these indexes going forward, so I will probably create my own build and release based on the latest beta to make that easy. Since I’m doing the work anyway, I figure I could do it for the project.


Duplicati - Dropbox Backup - Short lived token expires describes the usual flow and touches on the next release.
Unfortunately, details of the next release are TBD, partly because Dropbox now fails for new backups.

For the broadest fix to reach users, this should wind up in some sort of Beta; most people won’t run Canary.
That’s the first public test of new fixes and features, and sometimes new issues become visible…
Looking at Canary release history will show when an emergency respin of Canary has happened.

Experimental has recently been used before Beta to help make sure upgraders get no new issues.
The autoupdater is not very reliable (needs some expert help), plus sometimes DB updates break. There has been only one Canary since Beta (slower than usual due to few volunteer developers), and you could click on the 75 commits to master since this release,
which shows what went into the master branch since then. Your SQL work is currently the newest.

Releases are announced in the forum’s Releases category, although I expect many don’t read that.
You might note that, especially for Canary, a release note tries to get fairly detailed. Beta is less so.

If you’re OK with interpreting commit history, as it turns out there’s a release manager opening now.
This would let you help set the pace, helping to decide what’s worthy of shipping, and in what form.

A somewhat parallel issue to code fixes is how to get Duplicati on a newer .NET 5/6/? foundation.
Merge master into Experiment/net5 split #4683 is me talking about releases, leaders, and FreeBSD.

That’s maybe faster than things normally go (with possible exception of a maybe-hotfix for Dropbox).
Things such as personnel and processes change over time though, and improvement is a good goal.

Though it’s imperfect (and volunteers are needed to improve it), it does run 4 million backups/month.
There’s a need to keep them going. If a Canary breaks, it impacts the daring early adopters; a broken Beta potentially harms not only ongoing production backups, but could also damage existing backups that can’t be replaced.

It will be interesting to see how the Dropbox fix releases. It fixes new Dropbox backups, but changes authentication code in both the installed Duplicati and the cloud-resident Duplicati OAuth Handler, so potentially affects any of the services listed there. How much broad risk to take to fix Dropbox faster?

There has historically been a wish to speed up the release cadence, but it takes volunteers to do that.
There have also been some strategies around what gets in when, avoiding big change before release.

One challenge with having an indeterminate release schedule is one doesn’t know when to slow down because there’s not really a grand plan. All of this could be improved if a release manager gets active.

This would surely help, as would some upgrade testing (including from quite old releases).

Many people don’t build from source, but a one-off build from the proven-but-aging Canary might be a step.
Things that have never been in any Canary are largely untested, but maybe your changes could be moved up.

I’m not sure of the limits of the build and release tools, but maybe test current-Canary-plus yours first?
This is the kind of release planning that needs to be done. It needs a leader, but also staff discussions.

The only person who knows exactly how this happens (probably some scripting) might be the current release manager, who is looking for a replacement. The role is somewhat customizable, I would think.

Mentioning Git is good because sometimes (and maybe more in the future), code needs to be moved.
Please consider what you (or any other takers – lots of openings) could do to help Duplicati’s success.

I definitely like your willingness to jump in to make things happen, and I’ll say that for @tsuckow and dotnet project as well. Different people have different interests, leading to different roles. Shape yours.

There’s a need for volunteers in many areas such as C#, .NET *, UI, Javascript, DB, test, docs, forum.

Under test there are automated unit tests, and it kind of stops there in a methodical way, but there are people asking about performance testing. Much is unknown, but the database is one known limitation.
There have been a few proposals for SQL improvements, but ones that turn into pull requests are few.
Thank you for showing the initiative on that and for following up on releasing the fix as you are doing here.

For shaking out some of the annoying reliability issues that the forum sees, try beating on Duplicati hard in testing.
I’ve recently enhanced the test setup on this PC. I had a profiling log and DB history (just copy/move).
Sometimes I need destination history, so I added rclone sync from cloud with --backup-dir option.
That preserves the deleted cloud files, and because backup file names are random, they won’t collide.
Now I can run a test faster and cheaper on local files, and stop Duplicati fast if my test error comes up.

There’s my view of some of the things that could be done, but very little will happen without volunteers.
This is a community project, and if anyone is interested in seeing it progress and succeed, please help.


The indexes I worked on mainly targeted performance issues on a newly created database as it is being filled. Once the database is done being built after the first backup, this bit of code (Run PRAGMA optimize when closing database connection by warwickmm · Pull Request #3749 · duplicati/duplicati · GitHub) helps database performance a lot.
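For anyone unfamiliar with that PR, here’s a minimal sketch of the idea: running `PRAGMA optimize` just before closing lets SQLite refresh query-planner statistics (an implicit ANALYZE where it judges them stale) based on how the indexes were actually used during the session. The `Block` table below is a hypothetical stand-in, not Duplicati’s schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical stand-in table, not Duplicati's actual schema.
conn.execute("CREATE TABLE Block (ID INTEGER PRIMARY KEY, Hash TEXT)")
conn.execute("CREATE INDEX IX_Block_Hash ON Block (Hash)")
conn.executemany("INSERT INTO Block (Hash) VALUES (?)",
                 [(f"hash{i}",) for i in range(5000)])

# Query through the index so SQLite gathers usage information
# that the optimizer can act on.
row = conn.execute("SELECT ID FROM Block WHERE Hash = ?", ("hash42",)).fetchone()

# The essence of PR #3749: run PRAGMA optimize right before closing,
# so SQLite can update statistics for indexes it saw being queried.
conn.execute("PRAGMA optimize")
conn.close()
```

The SQLite documentation itself recommends running `PRAGMA optimize` just prior to closing each database connection, which is why doing it at connection close (rather than on a timer) is a good fit.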
I will dig a little deeper into the restore process and see if there are any improvements to be made on the database side.