Is Duplicati kind of abandoned?

Let me quote a complete paragraph. I disagree that just anyone can do what you suggest that easily; however, if someone is really good, it may be possible. For everyone else, the route is forking and submitting pull requests.

Creating a pull request from a fork

People aren’t bound by the license, but they might be bound by other things – time, interest, ability, etc.

EDIT:

I gratefully note that some volunteers are maintaining development forks and offering unofficial pieces; however, to complete the process properly, volunteers are needed to officially pick up what people offer…
Yes, someone could do everything themselves if they wanted to (profit?), but that would be more effort.

I’m not talking about pull requests. It doesn’t actually need to be a fork either.

Duplicati can be built by anyone with programming experience; beyond that, it only needs an installer. As long as a build keeps all the same core behavior, anything can be added on top, and in a way that lets users move between builds. Programmatically speaking, it's entirely possible. E.g., changing the UI would not affect it. Adding fixes would not affect it either, as long as it's done in a way the other build can cope with. Etc., etc.

It's not hard at all. It's just general programming.

The thing is that an installer makes it easy.

Also, projects from git can be updated and merged, etc. So they can work on their own, and when Duplicati receives an update, it can be merged in. It's not a big deal. It's the whole point of GitHub.

The problem is that the 'commit volunteer' is really a maintainer (you don't want a clown who commits anything that comes in and walks away when problems happen, I guess?).
As I understand it, you have been searching for this for quite some time and so far no one has stepped up; worse, other experienced developers are missing too.
The only conclusion is that @kenkendk is irreplaceable.
It's all right really; there are only two ways: first, drop Duplicati. Second, find a way to make the project go forward without an irreplaceable manager. In my opinion, the only realistic way is to do the opposite of what has been done up to now. Until he half-retired, the project was the epitome of a bus factor of 1: one person was managing most of the project, doing and supervising everything. The problem is that as soon as that person has some stress to manage outside of the project, burnout is threatening.
The solution is to replace a single person with a team, where stressful problems are divided and shared.
The difficulty is to manage the transition.
Here is how I’m suggesting to do that.
It’s necessary to establish priorities.

  1. the project must not die of abandonment. So it's necessary to get a canary out (the old canary is outdated), then a beta.
  2. the project must start a process of un-kenkendkisation.
  3. when step 2 is well advanced, and only then, it will be possible to advance again.

So the canary to come must be a minimal refresh of the current one, without any unnecessary risk, because risk would be a distraction from task 2, the next priority.
My suggestion is to commit the library updates (Newtonsoft, MegaNZ, Storj) and the zero-risk changes. AFAIK there are 3 zero-risk PRs in the pipeline: one change to the file filters (no code change), and my 2 experimental PRs (they are gated behind an environment variable; people who want to take the risk of trying them will be aware of the risks).

Now, that's not very inspiring, so how can we advance? More slowly than has been the case, but some progress should be possible even while lacking a full-blown, experienced Duplicati developer. First, I could do a quick pass to rule out obviously bad stuff. I don't remember having seen any in the existing PRs, but that could be down to my poor understanding, of course. Second, and more importantly, testing by volunteers could be decisive. If, say, 3 people besides the PR submitter have tested a change and reported that it doesn't eat the cat, and even makes Duplicati work better, the PR could be adjudicated in.

Now, to do the testing, one needs a binary unless one is a developer. Obviously, building binaries on demand does not scale; it would go straight back to a bus factor of one.
That's what I have been trying to remedy these past few days.

Here is the project (that’s a proof of concept):

https://github.com/gpatel-fr/testdupl/actions/runs/4305844249

You'll notice the presence of an 'artifact' link (it's clickable). It lets you download a zip with Windows installers.

The 'testdupl' repo is a copy of the current Duplicati trunk, where I have replaced the automated tests with a test+build process. For any PR entered into the system, binaries are generated (in this test repo, they are configured to last 5 days).
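For readers unfamiliar with GitHub Actions, the mechanism looks roughly like the sketch below. This is illustrative workflow syntax, not the actual testdupl configuration; the build script name and output path are invented:

```yaml
name: build-windows-installer
on: [pull_request]

jobs:
  build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3
      - name: Run tests and build the installer
        run: ./build-installer.ps1          # hypothetical script name
      - name: Publish installers as a downloadable artifact
        uses: actions/upload-artifact@v3
        with:
          name: windows-installers
          path: output/*.msi                # hypothetical output path
          retention-days: 5                 # the 5-day lifetime mentioned above
```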

At the moment, only Windows binaries are built; I have not yet gone further.
It should be relatively easy to build Linux (at least Debian) binaries; I just haven't had the time yet. As for Macs, there is the theoretical possibility. The first problem for me is that I have no clue about Macs, and even if I managed to output Mac binaries, I could not even test them. It's not in my plans to buy a Mac and learn this stuff. The second problem is that, AFAIK, running untrusted binaries on Apple OSes is difficult, and the necessary secrets will not be in the GitHub repo. Not an impossibility, but difficulties for me. I will do what I can in the coming days, but no warranty of quick success either.

Ultimately I don't know if the misery that is GitHub Actions will ever be able to manage every build for Duplicati, but in the short term it doesn't matter; only the big three platforms are really useful for enabling volunteers to help evaluate the PRs. If volunteers don't help with PR testing even with binaries freely available, it would unfortunately spell the end of Duplicati, unless some genius steps up to do the impossible.

Going back to the second priority: the way I have done the GitHub action that builds the Windows binaries matters a lot to me. It's a script called by the GitHub action, and crucially, it can be run locally. That's the opposite of the current system, where the build script is published but can't really be run by anyone but the project author (it's tied to his computer's configuration, and can't be run on a computer that is not a Mac). Every person wanting to build and test Duplicati should be able to do that on their local computer (Windows, Mac, Linux) exactly as it's generated on the host server (GitHub). I'll pass on Docker, since it's not a dev environment by definition. This should be a major goal, to maximize the chances of getting more help. That's where most of my effort would go if I take the committing task, much more than reviewing PRs. The experimental PRs I have done are also part of that: it's necessary to rid the code of some very hard-to-read parts if new help is to be found.
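A minimal sketch of that shape, with invented names throughout (the real testdupl script is more involved): the entry point takes everything from environment variables with workable defaults, so CI and a developer's machine run the identical steps, and the actual dotnet invocation is left as a commented placeholder.

```shell
#!/bin/sh
# Sketch of a build entry point runnable both locally and from CI.
# The GitHub action would simply run this same file.

build_duplicati() {
    config="${CONFIGURATION:-Release}"   # CI can override; local default works
    outdir="${OUTPUT_DIR:-./output}"
    mkdir -p "$outdir"
    echo "building configuration $config into $outdir"
    # dotnet build Duplicati.sln -c "$config" -o "$outdir"   # real work here
}

build_duplicati
```

The design point is that no step depends on one particular machine's setup: anything machine-specific comes in through the environment.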

Finally, about the PR advocated by @dgileadi: I agree that tests are very important. However, they are less important than priority two: turning the project into a team effort. Also, the failing tests are the AppVeyor ones, which AFAIK are only used for code coverage; the GitHub tests do the same unit testing. IMO a greater priority would be to make the GitHub tests work reliably (sometimes the Mac tests fail, sometimes the UI tests fail).
The change by tz-il is very interesting; I have even spent 2 hours in an AppVeyor VM trying to make sense of it, but ultimately I would not take the risk of committing a change that has no clear rationale and doesn't fix a pressing breakage.
If enough volunteers test it, or if some clear explanation is found, that would be another story.

It was a bit looong-winded, but that's it. This is my proposal. If you agree, you ask @kenkendk to grant me commit rights. Once that's done, I'll commit the PRs necessary to refresh the current canary without taking risks.
Basically, you are the release manager: since it's not yet possible to release without him, you ask him to generate the canary when you think it's time, and then the next beta. But I'm not going to commit every PR that you want, because I don't think enhancing the product is a priority now; it's already very good as it is. Only breakages are more important than going forward with a new approach.


I’ve gone ahead and done this: Release Test build for native macOS menu bar icon · dgileadi/duplicati · GitHub

Thanks everybody, I never expected to see this extensive discussion and this much information in response to my fairly simple question. Nobody seems to dare give a straightforward answer though, so I'll try to sum up for myself:

Yes, Duplicati is more or less abandoned: the main contributor @kenkendk has retired from working on the project, and because some major tasks necessary to create new builds have been tailored to be done only by him, it seems rather unlikely that somebody else, be it a single person or a team, will be able to step in in the short term. This is especially sad because there are in fact some dedicated volunteers with PRs and fixes ready; right now there is just nobody able to glue the pieces together.

Sad, but true. It's backup software, and it's used (not only, but also) by people without any IT background who just need to protect their data. So reliability is crucial, and it will suffer from aging and lack of maintenance (and already does).

This is not the answer I was hoping to get, but this is what, as far as I understand, the discussion boils down to.


He said he has no time at the moment, but hopes to return (date unknown).
If you read that as “retired”, feel free, but even “away now” will delay things.

He said he’s keeping the release making. Commits were not solely his area.
Someone else (now inactive) was helping on some (maybe not the trickiest).

Above is picking on summary tone, but we’re obviously in a slow period now.
How long will it take to regroup a bit? Probably we’ll have to see how it goes.

I’m pretty sure that limited volunteers in limited areas can help, so volunteer.
Nice would be a larger set of people to call on for specific talents they bring.

In the short term, we have who we have, and they do what they can for now.
@gpatel-fr has reported some marvelous work. I’ve long wanted test builds.

Supportability of Newer MacOS might be worth a look, both for macOS work
and for some sort of scheme of having people run artifacts. Compare notes?

@dgileadi might want to take a look at the above, and maybe search issues.
IIRC there was at least one person with some Macs that they could test with.

In terms of next releases, that might need a little more thought. Often there’s
a discussion among the staff (look for prior ones), but I like the general ideas.

Lower risk PRs and PRs fixing things that have broken over time are primary.
I’d also like reliability fixes, but sometimes PRs aren’t there yet. Anyone else?

2023 release planning topic in Developer category is now open to hopefully gather some followups.
Releases don’t just happen. It should be clear now that it takes many people. Help out somewhere.
Even non-developer users can help others (freeing up developers) or help on test, docs, and so on.

Note that I have now generated my own build,

https://github.com/gpatel-fr/testdupl/actions/runs/4332749394#artifacts

but I don’t know if it works :frowning:
Maybe you or someone with a Mac could test the dmg?

Hello,

Sort of stumbled in here because of the same thing, when I found out my servers hadn't been backed up for a month due to the VSS service hanging, and Duplicati hadn't emailed me about that error (but it does with other errors).

Don't need to convince me that free time is limited, but have the Duplicati developer(s?) ever thought about taking the software commercial?

My business uses it because it's a perfect middleground between an enterprise solution like Veeam and, well, nothing.

We’d be happy to pay a subscription for it.

feel free to add an issue to the Github project:

as interim maintainer, I don't feel it should be addressed as a priority in the next update, but if it can be confirmed with enough detail to pinpoint the cause, it would be interesting to address.

I will add it to my list to try to reproduce, and I'll make an issue once I have a 100% reliable way of reproducing it.

I’m a garbage coder, but even I know the pain of hunting down intermittent bugs.

hmm… I'll set up a VM with the same OS, then install Duplicati with a test backup, then try to make VSS fail. Usually it fails by itself, but in this case it would need to be made to fail on purpose. Maybe find a library involved in a VSS provider and delete it :slight_smile:

Welcome to the forum @guemi

In addition to an issue (or at least a forum topic), a workaround might be a third-party monitor such as
Duplicati Monitoring, which knows the expected backup schedule and independently sees that it's followed.

Discussed by the original developer recently here, basically talking about the risks of starting a business. Thanks, though, for being another voice speaking to the missing-middleground problem, which would be another good discussion topic. As you may note, I'm trying to keep this a bit on-topic.

I’d love to talk more about both ideas, if you’d care to move them. If you do, we can edit here to point.

Fantastic.

Will try it out and will add it to our FOSS donation plan this year in addition to Duplicati.

My company built our SaaS apps on FOSS, so we donate to 1-3 projects every year to give back, but obviously a recurring income is necessary if one is to be able to live off it.

I have unfortunately not written a single line of .NET code in my life, and would be useless in that regard. Can't say I have an abundance of cash either (but I will happily use company money :wink: )

But what Duplicati is lacking for me as a sysadmin / DevOps engineer is a centralized server, so I don't have to go to our 12 web portals every morning to double-check that backups have run properly.

If I were to monetize Duplicati, I would start by offering that behind a paywall, and later perhaps offer a free version that's good enough for small scale and a "pro" version for bigger needs with included support.

I DO HOWEVER have compute hardware and datacenter space in abundance (once again, that company cash). Perhaps I can take a load off @kenkendk's back / costs by hosting that for free?


Donations has information. Anything helps, but it’s not clear it’s enough for anyone to quit a day job. Open source (especially GPL) makes it tough to monetize. Some value-add might offer a path, BUT

Duplicati has an ecosystem of free or donation based third party monitoring tools that fill some spots.
I don’t want to get in deep in that, but the point is that monetization faces challenges of free software.

Anyway, something to think about to see if money is possible. For now, Duplicati relies on volunteers.

Thanks for the hosting offer. I don’t know what kenkendk actually sees here, but I’ll keep that in mind. Some of the low-cost hosting now used is a bit flaky. If we find test people, they would need gear too.

The user community eventually hits them but can’t provide good debug info. Formal beating on lots of systems equipped with good logging, etc. would be one way to help track elusive intermittent issues…

Once again, volunteers are needed. We need to recruit a broad base of community support somehow. There are lots of different components to making a software product. There’s a lot more than coding…

The solution I came up with for this is to have Duplicati report to my Zabbix monitoring server every time a job runs. In addition to logging both successful and failed jobs, it will alert me if too much time goes by in between completed backups. This is useful in cases where Duplicati ends up “stuck”, and neither succeeds nor fails at the job.

I have found in my experience as a sysadmin, that “Nothing happened” can be even more dangerous than an outright failure, as it tends to not trip any alarms.

Would you mind sharing how you did it? We use PRTG and I was thinking of doing the same thing, but I'd rather not reinvent the wheel if you've already come up with a good solution.

@guemi

Duplicati can report the success / failure of a job over HTTP. You could try using a PRTG HTTP push sensor and set the scanning interval to fit the backup plan.
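A hedged sketch of what that could look like on the Duplicati side (the advanced option name should be verified against your Duplicati version; the PRTG host, port, and token are placeholders you would take from your own HTTP Push sensor's settings):

```shell
duplicati-cli backup "b2://bucket/backup" "C:\data" \
    --send-http-url="http://prtg.example.com:5050/YOUR_PUSH_TOKEN"
```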

The search box at the top of the screen shows 5 PRTG topics. Maybe open another, if that’s needed.

I’m not sure I would call my solution good. It works, but was cumbersome to set up.

I wrote a batch file that is called by Duplicati's run-script-after option when a job runs. The batch file gathers some information from the environment variables that Duplicati sets when it runs and submits it to Zabbix via the zabbix_sender application. I then wrote a Zabbix template that accepts the inputs and populates the various monitors in Zabbix.
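As a rough illustration of the approach (shown as POSIX shell rather than the Windows batch file described above; the DUPLICATI__* variable names follow Duplicati's run-script convention but should be checked against your version's docs, and the Zabbix server name and item key are invented):

```shell
#!/bin/sh
# Sketch of a run-script-after hook that forwards a job result to Zabbix.

report_result() {
    sender="${ZBX_SENDER:-zabbix_sender}"          # path to zabbix_sender
    server="${ZBX_SERVER:-zabbix.example.com}"     # assumption: your server
    backup="${DUPLICATI__BACKUP_NAME:-unknown}"
    result="${DUPLICATI__PARSED_RESULT:-Unknown}"  # e.g. Success / Warning / Error

    # One trapper item per run; Zabbix triggers can alert on bad values and
    # on nodata() when nothing has reported for too long ("stuck" jobs).
    "$sender" -z "$server" -s "$(hostname)" \
        -k "duplicati.result[$backup]" -o "$result"
}

# Only attempt the send when zabbix_sender is actually installed.
if command -v "${ZBX_SENDER:-zabbix_sender}" >/dev/null 2>&1; then
    report_result
fi
```

A nodata() trigger on that item is what covers the "too much time goes by" case described earlier: Zabbix alerts even when Duplicati neither succeeds nor fails.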

If the run-script-result-output-format setting is set to JSON, it will also submit additional data that I can use to track other things, such as the size of the source and backups, size of uploads and downloads, and various other helpful pieces of data.

I wish I could have Duplicati submit the data directly to Zabbix via http, bypassing the need for a batch file and run-script-after setting, but Zabbix doesn’t support accepting inputs that way.

I’m not sure how this would translate to PRTG, but I’m sure a similar solution could be put together.
