Duplicati - Dropbox Backup - Short lived token expires describes the usual flow, and the next release.
Unfortunately, details of the next release are TBD, partly because Dropbox now fails for new backups.
To get the fix to the broadest set of users, this should wind up in some sort of Beta; most people won’t run Canary.
Canary is the first public test of new fixes and features, and sometimes new issues become visible…
Looking at Canary release history will show when an emergency respin of Canary has happened.
Experimental has recently been used before Beta to help make sure upgraders get no new issues.
The autoupdater is not very reliable (it needs some expert help), plus database upgrades sometimes break.
2.0.6.100_canary_2021-08-11 is the only Canary since the 2.0.6.3 Beta (slower than usual due to few volunteer developers), and you could click on its “75 commits to master since this release” link, which shows what went into the master branch since then. Your SQL work is currently the newest.
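For anyone who prefers the command line to the GitHub link, the same count comes from git itself. A minimal sketch in a throwaway demo repo (the tag name below is made up; against Duplicati’s real repository you’d use the actual release tag against master):

```python
import subprocess
import tempfile

def git(*args, cwd):
    # Thin wrapper so the demo setup stays readable
    subprocess.run(["git", *args], cwd=cwd, check=True,
                   capture_output=True, text=True)

repo = tempfile.mkdtemp()
ident = ["-c", "user.email=demo@example.com", "-c", "user.name=demo"]
git("init", "-q", cwd=repo)
git(*ident, "commit", "-q", "--allow-empty", "-m", "tagged canary release", cwd=repo)
git("tag", "canary-release", cwd=repo)
for msg in ("fix 1", "fix 2"):
    git(*ident, "commit", "-q", "--allow-empty", "-m", msg, cwd=repo)

# Same number the GitHub "commits to master since this release" link reports:
count = subprocess.run(["git", "rev-list", "--count", "canary-release..HEAD"],
                       cwd=repo, check=True, capture_output=True,
                       text=True).stdout.strip()
print(count)
```

`git log canary-release..HEAD --oneline` lists the same commits individually, which is handy when deciding what’s worthy of shipping.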
Releases are announced in the forum’s Releases category, although I expect many don’t read that.
You might note that, especially for Canary, a release note tries to get fairly detailed. Beta is less so.
If you’re OK with interpreting commit history, it turns out there’s a release manager opening now.
This would let you help set the pace, helping to decide what’s worthy of shipping, and in what form.
A somewhat parallel issue to code fixes is how to get Duplicati onto a newer .NET 5/6/? foundation.
Merge master into Experiment/net5 split #4683 is me talking about releases, leaders, and FreeBSD.
That’s maybe faster than things normally go (with possible exception of a maybe-hotfix for Dropbox).
Things such as personnel and processes change over time though, and improvement is a good goal.
Though it’s imperfect (and volunteers are needed to improve it), it does run 4 million backups/month.
There’s a need to keep them going. If a Canary breaks, it impacts the daring users who run it; a broken Beta potentially harms not only the production of backups, but could also damage existing backups that can’t be replaced.
It will be interesting to see how the Dropbox fix releases. It fixes new Dropbox backups, but changes authentication code in both the installed Duplicati and the cloud-resident Duplicati OAuth Handler, so potentially affects any of the services listed there. How much broad risk to take to fix Dropbox faster?
There has historically been a wish to speed up the release cadence, but it takes volunteers to do that.
There have also been some strategies around what gets in when, avoiding big change before release.
One challenge of an indeterminate release schedule is that one doesn’t know when to slow down, because there’s not really a grand plan. All of this could be improved if a release manager gets active.
This would surely help, and possibly some upgrade testing (including from quite old releases) will help.
Many people don’t build from source, but a one-off build based on the proven-but-aging Canary might be a step.
Things that have never been in any Canary are largely untested, but maybe your changes could be moved up.
I’m not sure of the limits of the build and release tools, but maybe test current-Canary-plus-your-changes first?
This is the kind of release planning that needs to be done. It needs a leader, but also staff discussions.
The only person who knows exactly how this happens (probably some scripting) might be the current release manager, who is looking for a replacement. The role is somewhat customizable, I would think.
Mentioning Git is good because sometimes (and maybe more in the future), code needs to be moved.
Please consider what you (or any other takers – lots of openings) could do to help Duplicati’s success.
I definitely like your willingness to jump in to make things happen, and I’ll say that for @tsuckow and dotnet project as well. Different people have different interests, leading to different roles. Shape yours.
Under test, there are automated unit tests, and methodical testing kind of stops there, but there are people asking about performance testing. Much is unknown, but the database is one known limitation.
There have been a few proposals for SQL improvements, but ones that turn into pull requests are few.
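Anyone weighing an SQL proposal can get quick feedback locally, since Duplicati’s database is SQLite. A minimal sketch using Python’s built-in sqlite3 (the table and query here are illustrative, not Duplicati’s actual schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Block (Hash TEXT, Size INTEGER)")
con.executemany("INSERT INTO Block VALUES (?, ?)",
                ((f"hash{i}", i) for i in range(10_000)))

def plan(sql):
    # EXPLAIN QUERY PLAN shows whether SQLite scans the table or uses an index
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT Size FROM Block WHERE Hash = 'hash42'"
print(plan(query))   # before the index: a full table scan (exact wording varies by SQLite version)
con.execute("CREATE INDEX BlockHash ON Block(Hash)")
print(plan(query))   # after: a search using index BlockHash
```

Comparing query plans (and timings on a realistically sized database) before opening a pull request makes the proposal much easier to evaluate.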
Thank you for showing initiative on that, and for following up on releasing the fix as you are doing here.
For shaking out some of the annoying reliability issues that reach the forum, test by beating Duplicati hard.
I’ve recently enhanced the test setup on this PC. I had a profiling log and DB history. Sometimes I need destination history as well, so I added an rclone sync from the cloud with --backup-dir.
That preserves the deleted cloud files, and because backup file names are random, they won’t collide.
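For reference, rclone’s `--backup-dir` flag is its mechanism for this: files a sync would otherwise delete are moved into the given directory instead. A minimal Python sketch of that preserve-on-delete idea (the function and paths are my own illustration, not rclone’s code):

```python
import shutil
from pathlib import Path

def sync_with_backup(src: Path, dst: Path, backup: Path) -> None:
    """Mirror files from src into dst; files that vanished from src are
    moved into backup instead of deleted (like rclone sync --backup-dir)."""
    dst.mkdir(parents=True, exist_ok=True)
    backup.mkdir(parents=True, exist_ok=True)
    src_names = {p.name for p in src.iterdir() if p.is_file()}
    for p in list(dst.iterdir()):           # move would-be deletions aside
        if p.is_file() and p.name not in src_names:
            shutil.move(str(p), str(backup / p.name))
    for p in src.iterdir():                 # copy current files over
        if p.is_file():
            shutil.copy2(p, dst / p.name)
```

Because Duplicati’s destination file names contain random strings, successive moves into the backup directory won’t collide with each other.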
Now I can run a test faster and cheaper on local files, and stop Duplicati fast if my test error comes up.
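The stop-on-error part is easy to script; a minimal sketch, assuming the error shows up in the process output (the commented-out Duplicati invocation is a placeholder, not a verified command line):

```python
import re
import subprocess

def run_until_error(cmd, error_pattern, max_runs=100):
    """Run cmd repeatedly, stopping at the first run whose output matches
    error_pattern or that exits nonzero. Returns the 1-based number of the
    failing run, or None if every run was clean."""
    pat = re.compile(error_pattern)
    for run in range(1, max_runs + 1):
        proc = subprocess.run(cmd, capture_output=True, text=True)
        if proc.returncode != 0 or pat.search(proc.stdout + proc.stderr):
            return run
    return None

# Hypothetical usage -- command name and arguments are placeholders, not a
# verified Duplicati invocation; check your install for the real ones:
# run_until_error(["duplicati-cli", "backup", "file:///tmp/dest", "/data"],
#                 r"(?i)error|exception")
```

Stopping immediately at the failing run keeps the profiling log, database, and destination in exactly the state that produced the error, which is what makes the problem diagnosable.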
There’s my view of some of the things that could be done, but very little will happen without volunteers.
This is a community project, and if anyone is interested in seeing it progress and succeed, please help.