Are passwords and server credentials protected now?

I read through “Master password to protect backup passwords and server credentials #558” and some others.

I may be wrong, but my understanding is that the various passwords are not yet protected. Is this correct? To clarify - I am talking about scrambling.

JP

Welcome to the forum @JP801

I believe Duplicati on Windows scrambles the DB unless you set --unencrypted-database on the server start line.
On other OSes it seems possible but unlikely, as most don’t ship a suitable version of SQLite, per
Which tool can open encrypted DB
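
As an aside, “scrambled or not” is easy to check, because a plaintext SQLite file always begins with the 16-byte magic header “SQLite format 3\0”; a scrambled one won’t, which is exactly what defeats dumb scans. A minimal sketch in Python (the path is hypothetical; point it at your actual Duplicati-server.sqlite):

```python
# Minimal sketch: test whether an SQLite file is stored in cleartext.
# A plaintext SQLite database always begins with this 16-byte magic header;
# a scrambled/encrypted one will not.
MAGIC = b"SQLite format 3\x00"

def looks_like_plaintext_sqlite(path: str) -> bool:
    with open(path, "rb") as f:
        return f.read(len(MAGIC)) == MAGIC

# Hypothetical path; adjust to wherever your install keeps the server DB:
print(looks_like_plaintext_sqlite("Duplicati-server.sqlite"))
```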

On a Windows service install, you also get some protection from the DB being in the SYSTEM profile, unless it’s
been relocated to a better spot to keep Windows from sweeping it into Windows.old on version upgrades.

On Linux, it looks like file permissions allow only owner access. Of course, attacks could bypass this.
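
If anyone wants to verify that on their own machine, here is a minimal sketch (standard library only; the path is a hypothetical default, so adjust it to your install):

```python
# Minimal sketch: check that a file is accessible by its owner only,
# i.e. no read/write/execute bits for group or other (e.g. mode 0600).
import os
import stat

def owner_only(path: str) -> bool:
    mode = stat.S_IMODE(os.stat(path).st_mode)
    return mode & (stat.S_IRWXG | stat.S_IRWXO) == 0

# Hypothetical default location; adjust to your install:
print(owner_only(os.path.expanduser("~/.config/Duplicati/Duplicati-server.sqlite")))
```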

Thank you for your fast answer. Let me first say how grateful I am that Duplicati exists and that you all invest so much of your time in it. I really am. It is such a great tool.

But here are my concerns:
We all know that Trojans these days are getting more and more intelligent. They now routinely check for cleartext passwords stored in the config files of, e.g., FileZilla and WinSCP. They then use these passwords to access that data and encrypt it as well. FileZilla is a very good example: the author long refused to encrypt the password database with a master password and was finally forced to do it. There is a good reason why browsers secure their password databases with a master password as well. The option to secure the GUI with a password already exists in Duplicati. This will not be a solution for all cases, but it will be for many.

Duplicati is by now quite famous. As an example, the German computer magazine c’t from Heise has recommended Duplicati in quite a few articles as ‘the’ backup tool against data-encrypting Trojans. From my perspective, it is only a matter of time (if it isn’t happening already) before Trojans routinely scan the Duplicati DB for account data and then access the cloud storage. To put it sharply (sorry): as long as that can happen, Duplicati is no assurance against Trojans, only against technical disasters like disk crashes and user error.

Having said this, please fix this AND, as long as it is not fixed, advise your users about the situation. Again, I do not want to be impolite or harsh, and I very much appreciate all the fantastic work you have done. But from my view, that fix should be high on your priority list.

JP

The only solid assurance against trojans is a destination that no program on the computer can destroy permanently. Some destinations (e.g. Office 365) have versioning that could be useful for a rollback.

The reason for this is that no matter how securely you store destination credentials, at some point they actually have to be used to access the destination, and therefore are available to sophisticated attacks.

Programs can be examined with debuggers at that point, and all of the decrypted secrets can be seen.
This is quite easy. Debuggers are used all the time during program development to look at variables…

DLL injection is a way to get programs to do what one wants, common enough for a Wikipedia article.
Possibly Duplicati’s use of .NET Framework or mono adds a level of indirection (it’s not native code).

Another way to get a program to do what you want is to just replace that program with a changed one.

Passwords can be intercepted through keystroke logging. There are a lot of ways to get credentials…

Once there is an attacker on your system, the system is taken, so one hopes the backup is safe from the system itself.
Automated malware is probably less sophisticated, but as you note, malware technology is improving.

Obviously an offline backup is pretty safe, but an online backup that doesn’t allow deletes or overwrites almost works with Duplicati if one sets --no-auto-compact to stop compacting. You also have to keep all versions…
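
To make the idea concrete, here is a minimal sketch of driving such an “add-only” job. The executable name, destination URL, and source path are placeholders for illustration; only --no-auto-compact is the option discussed above, and retention options are deliberately left unset so all versions are kept:

```python
# Minimal sketch: an "add-only" Duplicati run. With compacting disabled and
# no retention options set, the job should only ever upload new volumes.
import subprocess

cmd = [
    "duplicati-cli",                       # adjust to your install (e.g. Duplicati.CommandLine.exe)
    "backup",
    "ssh://backup.example.com/duplicati",  # hypothetical destination URL
    "/home/user/data",                     # hypothetical source folder
    "--no-auto-compact=true",              # compact rewrites and deletes remote volumes; stop it
]
subprocess.run(cmd, check=True)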

The end result is that unless there’s a maintenance window to allow cleanup, backup history builds up.
The more subtle issue is that uploads that fail are retried under a different name, and the old one is deleted; therefore, if a destination gets half a file (likely easy on file storage, harder on clouds), the delete is a noisy failure.
There are several posts on the forum from people who are trying this. I’m not sure how well it’s going.

Similarly, cloud storage vendors seem to be increasing their support for limiting what can be changed.
This is an evolving area because, as you say, it’s a bit of a race between the attackers and defenders.
Some schemes that look good at first glance (e.g. no deletes) may have holes (e.g. allow overwrites).

There are other reasons. Duplicati has a command line version that may run from cron or a scheduler.
This needs to get credentials while running unattended, meaning it can’t ask the user for the password.

As for the GUI password, you’re most likely typing it over an unencrypted (non-HTTPS) connection, so network packet interception is possible. One can set up server encryption, but most users don’t. There’s some difficulty because (unlike SSH) SSL/TLS requires one to arrange for a server certificate.
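
A quick way to see which case you’re in: if the server port doesn’t complete a TLS handshake, the GUI password crosses the wire in cleartext. Minimal sketch (8200 is the usual default port; adjust if yours differs):

```python
# Minimal sketch: does the local Duplicati web UI speak TLS?
import socket
import ssl

def speaks_tls(host: str = "localhost", port: int = 8200) -> bool:
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE   # self-signed is fine; we only test for TLS
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):   # plain HTTP (or unreachable) lands here
        return False

print("encrypted" if speaks_tls() else "cleartext (or server unreachable)")
```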

The list goes on and on. If there are security experts out there with more to offer, feel free to chime in…
For now, I’m still sticking with the idea that if the system gets taken over, a password is not assurance.

It is, though, a step in the right direction if one forces an attacker to hack running programs (which can happen – you seem to follow security, so you can look at some of the exploits that have been around).

Protecting data-at-rest (e.g. the database) has some value. As mentioned, the Windows DB is scrambled, which is actually encryption (though not high quality), with a fixed key that you can override if you want.
Clear text password stored in Duplicati-server.sqlite shows how to do this, along with the future plan from the primary developer, who unfortunately seems to be extremely busy, as are all of the too-few volunteers.

Linux is messy, as mentioned in linked posts. One never knows what one will find on some Linux distro. Having Duplicati supply database code of its choice on all the non-Windows systems seems infeasible.

There are lots of things high on the priority list, including reliability. Duplicati is still considered Beta level. Progress depends on volunteers, and there are very, very few. You can even see this in the forum posts.
The forum is not an issue tracker. That’s GitHub issues, and one good one where you can comment is at:

Fix not revealing stored passwords from the UI [$100] #2024, where the last input was in 2019; one post there says a person is using a VeraCrypt volume solution (which presumably asks for a password at boot).
I’m not sure how well that protects the data in all ways, but it should help if the computer gets stolen.

If you want manual changes, the home page of the manual says how. An issue might also be accepted; however, how to phrase it is a question because, as I said, a password is not assurance against attacks.

How One MSP Protects Customers From Ransomware is a new post on the idea, using server-side immutability so that even a fully compromised backup client (hopefully) cannot destroy the remote backups.

They used a rather complicated design, with about three levels of backup, the final one being a remote immutable store.
It’s also not clear how their space efficiency works, or whether they just move entire finished backups around…

Duplicati’s entire finished backups can certainly be put elsewhere if you can get an append-only location.

The pains I mentioned of giving up compact and version deletes might also be hidden by uploading just the new files to the immutable remote from a convenient local backup. Basically, do incremental copies without deletions.

rclone copy can even do that efficiently, but not propagating deletes will build up extra files over time that would have to be cleaned up, maybe with rclone sync (which unfortunately needs an intact local to sync from) or maybe some other way. I’m pretty sure Recovering by using the Duplicati Recovery tool ignores extras. Restoring files if your Duplicati installation is lost is less clear. In either case, some testing is called for…
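
A minimal sketch of that copy-without-deletes idea, with a hypothetical local path and rclone remote name:

```python
# Minimal sketch: push new/changed Duplicati volumes to an append-only remote.
# "rclone copy" transfers what is missing or changed but never deletes at the
# destination, so the remote stays a superset of the local. "rclone sync"
# (commented out) would also propagate deletes, so it is only for a later
# cleanup pass from an intact local.
import subprocess

LOCAL = "/backups/duplicati"           # hypothetical local Duplicati destination
REMOTE = "immutable:duplicati-copy"    # hypothetical rclone remote

subprocess.run(["rclone", "copy", LOCAL, REMOTE], check=True)
# subprocess.run(["rclone", "sync", LOCAL, REMOTE], check=True)  # cleanup pass
```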

As ts678 relates, security is a very difficult thing even for security experts, and this could potentially become a much longer read.

Just think of this: because of 2FA, hackers may instead try to gain access to the remote company in an attempt to reach the remote servers which may have your files on them. So even 2FA can be meaningless. It is of course still better, and a great idea, to enable it everywhere possible.

But if you back up remotely, then you should be using a strong password that would hopefully take too many years to crack to be worth the effort. And that’s assuming there isn’t a security flaw.

I make various local backups onto thumb drives, other computers, and USB external drives only.

The one downside that I currently don’t like with local Duplicati is that an account isn’t really used this way, as it has to back up to a network drive, for example. Though access to the drive may require a password, the drive has to always be available. A virus or whatever can just jump through that, since it’s listed as a drive, and mess it all up. This is not good. Resilio Sync backup, for example, doesn’t have this problem, as there’s no drive and the application deals with connecting the two points.

So ransomware could just see the drive and encrypt all those files into a zip or whatever, and you would lose access to your local Duplicati files and be unable to restore them. To do that with Resilio Sync backup, it would have to be able to scan your network drives or computers, somehow get access (since there isn’t any other than through Resilio Sync), and then encrypt the files.

But you can see from large companies that get ransomware (lol) that many people are no good at security, even when they are being paid and say they are.

Edit - To add for completeness, it is possible to back up locally and use SSH or FTPS, for example, among others. So Duplicati should already be able to do local in a bit of a better way when doing that :slight_smile:

Each person needs to decide how far to take it. It only takes once, and if it’s not far enough, bam. Duplicati could maybe do a full security review, or maybe even offer a highest-security DB option or something, but users kind of have to decide for themselves anyway.


Absolutely right that there are many ways to hack a system and retrieve the needed passwords. And of course, if a skilled hacker wants to gain access to a selected system, he/she will succeed in most cases. That is, by the way, the reason why backups are so important.

The ‘usual’ ransomware Trojans routinely scan certain locations and look for passwords - the easy wins. We know that browsers, FileZilla, and WinSCP are such locations. That is why these offer to encrypt passwords with a master password.

Storing passwords used to access remote storage in plain text is, in my view, not acceptable. I know that a master password cannot be used in all cases, but in many it is usable and considerably increases cloud security.

Duplicati’s popularity has increased greatly because people understand the threats and that a backup (strategy) is needed to protect their data. Duplicati is a great solution for use with local (offline) disks. But as cloud storage gets cheaper, people want to add another layer of security and back up there. They invest time and money to do so.

That is fine and works perfectly as long as the backup is used to protect against crashes, data loss, and so on. However, in the event of a Trojan incident, they will be frustrated to find out that their cloud backup happens to be worthless.

If your - the programmers’ - priorities are different and that kind of security is not on your priority list, fine! It is your time and resources, and frankly, we are all very, very grateful for the fantastic work you have done and are still doing. But please tell people that for cloud backups against Trojans, it may not be the right answer, or at least that they need to take additional measures. Write it down in a prominent place, so that users know it and can decide what their needs are.

Aside from that, I think you have to answer for yourselves the question of what your target group is and what its needs are. Hint: I seem to remember that in one of the forum entries, a member offered money for password protection.

There will never be total security, but this should not keep us from aiming for more security step by step. As for myself, my NAS pulls data on a daily basis for a versioned backup, and I make backups on rotated local (offline) disks. On top of that, I use Duplicati for additional monthly backups of the most important data to the external cloud. That last one is now at stake.

@Xavron: Thank you for the hint about Resilio Sync. I will give it a look, and I really agree with your points. :smiley:

If you want to appeal to the programmers (I am not one), please file a GitHub issue proposing a change.

Please notice that there are currently 768 issues open. Hang around the forum; consider helping with some. There are far too many demands on far too few volunteers, and adding encryption may be a huge task.
Also note that Windows (likely the most commonly attacked OS) impedes dumb scans with its DB scrambling.

Do the tools with master passwords require typing them in? How might that work in an automatic backup? Many tools rely on obfuscation, which isn’t great: it slows generic scans, but falls to sophisticated ones.
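
For contrast, the master-password pattern those tools use is roughly: derive a key from the typed password, encrypt the credential store with it, and keep nothing fixed in the program. A minimal sketch of the general idea (not Duplicati’s code; uses the third-party cryptography package):

```python
# Minimal sketch of a generic master-password scheme (NOT Duplicati's code):
# PBKDF2 turns the password into a key, which then encrypts the stored
# credential. Obfuscation differs in that its key is fixed and ships with
# the program, which is why it only slows generic scans.
import base64
import hashlib
import os

from cryptography.fernet import Fernet   # pip install cryptography

def key_from_password(password: str, salt: bytes) -> bytes:
    raw = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return base64.urlsafe_b64encode(raw)  # Fernet expects a base64 32-byte key

salt = os.urandom(16)                     # random; stored beside the ciphertext
f = Fernet(key_from_password("correct horse battery staple", salt))
token = f.encrypt(b"ftps://user:secret@example.com")  # the protected credential
print(f.decrypt(token))                   # possible only while the password is known
```

The unattended-backup problem is exactly that last line: something has to hold the password (or the derived key) for a scheduled run, which is where the protection gets weaker.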

In terms of big warning labels (what, a banner on the home screen?), I’m not personally sold on the idea.
I’ve thought specific items such as FTP without encryption might get warnings in the manual-that-many-people-don’t-read (but some do, and at least that’s a good place to warn people away from unencrypted transmissions over the Internet). If you follow this forum, you’ll find that some people try to devise better schemes, such as immutable backups (which the client can’t destroy even if it wanted to). I also try to encourage multiple backups (e.g. the 3-2-1 idea) when given the chance.

What I would support is a general section in the manual on good backup practices, including how to test them, although that’s not exactly a solved problem (there’s yet another priority), and I think far more backups are lost to bugs and environmental damage than to attacks. To some extent, actual problem rates help set what gets worked on.

Because you obviously care about protecting data, would you be willing to help compose a manual text? There are very few resources available there. I don’t control any resources except my own, which tend to disappear into forum work like this. The manual is maybe a bit unbalanced: it talks about the good things Duplicati has to offer, but doesn’t speak much to its not-so-great things, or to best practices for actual use.

I don’t dictate things in any way. Please propose to the developers via GitHub issues, or to the manual’s author as described. Even better, propose some text. Perhaps the GUI can link to it, for those who care to read. Command line users won’t get it, and I’m definitely against flinging a book at them on every command run…

Just note that Resilio Sync has its own problems. I moved on from it to Duplicati. Actually, I moved on from Duplicati backing up to a local drive/folder to Duplicati backing up locally through SSH, since an account is necessary there and hopefully is stored encrypted and decrypted only for login and the UI (I haven’t looked into the code for that).

Resilio Sync isn’t bad, but its cons are enough that I wouldn’t use it unless there were no other choice. It does make a point about better local usage.