Any updates on the plaintext password security problem?

The locally installed Duplicati has to use the passwords given, unmodified, on remote locations. Therefore they have to be available in clear text. I don’t think there’s even a theoretical solution to this.

It is still an issue.

Correct!

If the service requires a password (or token) to log in, we need to store that token in a manner that allows us to pass it to the server (i.e. at some point we need to pass it in plain text).

The solution I would like to see (but have not found time to pursue) is to use the built-in keychain features found in all major OS’s.

The keychains are (IMO) the best bet for safely storing credentials; they will use hardware support where available, and generally rely on the user's credentials to reduce snooping by root/admin users.

A real solution that works now (and with keychain support) is to have a non-privileged remote user that can only create new files. That way an attacker can steal the credentials (and read the remote files, as well as the local ones) but cannot compromise the server or the stored backup. You need to set --no-auto-compact and --keep-versions=0 so that Duplicati does not attempt to remove files, as that would fail. You can also disable verification if you want to block reading remote files as well, but I do not see the benefit in that.
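From the command line, such a job could look roughly like this; a sketch only, with placeholder host, user, and paths, not a tested configuration:

```shell
# Hedged sketch: back up to SFTP with automatic compaction disabled and all
# versions kept, so Duplicati never tries to delete or overwrite remote files.
# Host, user, and paths below are placeholders.
Duplicati.CommandLine.exe backup \
  "ssh://backuphost//srv/backups/mine?auth-username=backupuser" \
  /home/me/documents \
  --no-auto-compact \
  --keep-versions=0
```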

Duplicati will never overwrite a file, so you can always remove that privilege from the remote user. If you want retention, you need to have some kind of “soft delete” on the server where the files are moved somewhere else after being deleted, and then kept for “some time” until they are permanently deleted.
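The "soft delete" idea can be sketched server-side with a couple of shell functions; the trash location and the 30-day retention below are assumptions for illustration, not Duplicati settings:

```shell
#!/bin/sh
# Hedged sketch of a server-side "soft delete": instead of removing a file,
# move it into a trash directory stamped with the deletion time, so an
# attacker-triggered delete can be undone until the trash is purged.
TRASH="${TRASH:-/tmp/duplicati-trash}"

# Move a file into the trash instead of deleting it.
soft_delete() {
    mkdir -p "$TRASH"
    mv "$1" "$TRASH/$(basename "$1").deleted-$(date +%s)"
}

# Permanently remove anything that has sat in the trash for over 30 days
# (run this from cron, for example).
purge_trash() {
    find "$TRASH" -type f -mtime +30 -delete
}
```

How the "delete" request gets routed to `soft_delete` depends on the server setup; the point is only that the storage side, not the client, decides when data is really gone.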

I know this does not work with all storage providers, but it cannot be fixed client-side as an attacker with full system access can bypass anything done client-side.


The locally installed Duplicati has to use the passwords given, unmodified, on remote locations. Therefore they have to be available in clear text. I don’t think there’s even a theoretical solution to this.

I get the technical challenges. The root problem is that 1) there is no trusted third party involved, and 2) there is no Duplicati program on the receiving end that could process commands from the sender more safely.

But the issue still exists regardless.

A real solution that works now (and with keychain support) is to have a non-privileged remote user that can only create new files.

Oooo, this sounds like what I’m after. I’ll need to try and set this up.

The “receiving end moves instead of deletes” is a smart idea, as that solves another issue of an attacker changing your Duplicati settings to delete files.

Defense against trojans gets into the append-only idea. It doesn't do much to prevent reading your backup; however, if the attacker is already on your system, they can read your original files anyway. Another option is to run Duplicati as a Windows service under the SYSTEM user, so an attacker who can only get in as you won't be able to access the Duplicati databases (assuming you're a Standard User, not an Administrator). Avoiding password reuse is, as always, a good idea so as to limit the damage if something gets revealed.
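If you go the service route, the Duplicati 2 install ships a service wrapper; a hedged sketch, run from an elevated prompt, with the install path assumed:

```shell
REM Hedged sketch: install Duplicati as a Windows service. The service runs
REM as SYSTEM, so a Standard User (or an attacker running as one) cannot
REM read the job databases. Install path is an assumption.
cd "C:\Program Files\Duplicati 2"
Duplicati.WindowsService.exe install
```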

Although you seem to be talking about an attacker on a live system, full drive encryption can prevent some other hazards, such as losing a laptop and having all your secrets just sitting there in clear text on the drive. A lighter alternative to full drive encryption is to encrypt just the databases folder (tied to your login).

Windows Encrypting File System can do this. How to encrypt files and folders in Windows 10, 8 or 7 gives more information on this, and also on BitLocker (for FDE). Note that Windows Home lacks both of these… There are other Windows solutions around, and Linux has its own set (presumably most OSs have some).
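EFS can also be enabled from the command line; a hedged sketch, where the folder path is an assumption about where your Duplicati data lives:

```shell
REM Hedged sketch: encrypt the (assumed) Duplicati data folder with EFS.
REM /e enables encryption tied to the current user's login; files added
REM to the folder later are encrypted as well.
cipher /e "%LOCALAPPDATA%\Duplicati"
```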

This thread has been great.

Although you seem to be talking about an attacker on a live system

I don't mind if an attacker reads my files. In my case, I'm currently using SSH keys, so a compromised account on the source machine allows very easy read/write/delete access on the backup machine.

I just want to prevent delete access on the destination machine. I want a backup that doesn’t just back up files, but also ensures an unforeseen event (like a compromised machine) doesn’t ruin my backups. :slight_smile:

Avoiding password reuse is, as always, a good idea so as to limit the damage if something gets revealed.

It doesn't matter how often you change up passwords; they're available in plain text in Duplicati's SQLite database (and they have to be). Even an encrypted hard drive on the destination doesn't fix this problem if the user has delete rights.

To clarify, my “password reuse” point meant for different needs or sites, allowing more risk if it gets taken.
Companies that produce software to manage passwords like to point this out, but I think the risk is real…
52% of users reuse their passwords
Password Reuse Abounds, New Survey Shows

They don’t have to be stored in plain text, but they need to be convertible to that. As stored, Windows gets an obfuscated (weakly encrypted with fixed password) database unless you use --unencrypted-database. About Duplicati Security describes this. It’s a small hurdle. Ultimately a skilled attacker can get credentials unless something else stops them (such as running Duplicati as root and hoping they don’t get that far…).

The encryption suggestion was for the client system, to limit credential theft, e.g. if the system gets stolen. Unfortunately I couldn't quickly find anything for Linux like Windows EFS at a folder level, which I thought might give keychain-like benefits on Windows Professional and above. Linux seems to have full drive encryption solutions. I think the issue with most of these is access control after unlock: if you get access, an attacker who's indistinguishable from you will probably also get access.

Security is hard, but perhaps somebody will think of small hurdles for Linux to make attacks a little harder.

And don’t forget macOS…

macOS got mentioned in GitHub issue 2024 from the original post here, and that will be a better reminder; however, one person on that thread proposes that Linux come first if a universal way doesn't exist. :wink:

Using the OS X Keychain to store and retrieve passwords was one search result covering keychain access from the CLI, which could possibly serve the Duplicati command line, and maybe even the Server or Tray Icon if the information could be passed in at initial launch without creating too big a security exposure. Use of the macOS security command itself seems subject to the calling program being legitimate, and it also faces the operational difficulty of prompting.
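For reference, the basic store/fetch round trip with the security command looks like this; a hedged sketch, where the service name "duplicati-remote" is made up for the example:

```shell
# Hedged sketch: store and retrieve a secret via the macOS keychain CLI.
# "duplicati-remote" is an invented service name; passing -w on the command
# line also leaves the secret in shell history, so this is illustration only.
security add-generic-password -a "$USER" -s duplicati-remote -w 'hunter2'
security find-generic-password -a "$USER" -s duplicati-remote -w
```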

A couple of questions:

--no-auto-compact

Command line help says:
–no-auto-compact
Performs a compact process after purging file

Should that have said “Does not perform a compact process…”?

--keep-versions=0

The GUI says: Use this option to set number of versions to keep, supply -1 to keep all versions
Default value: “0”

To be clear, I'm using the smart backup retention policy. But I read that it and --keep-versions are mutually exclusive. So suppose I didn't have the smart backup policy and turned on --keep-versions…

Would --keep-versions=-1 be the right answer here to ensure any attempted delete doesn't occur? Otherwise, wouldn't --keep-versions=0 mean I don't keep any versions, so every time a new backup runs, Duplicati tries to delete the prior version?

This has been fixed in source and will be part of the next build:

The code for deleting checks for KeepVersions > 0, so both 0 and -1 work.


Perhaps @brad could clarify what help was asked for. The help for no-auto-compact is fine even in 2.0.4.5, but help for purge seems to show the mentioned issue (even in source code master branch).

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe help no-auto-compact
  --no-auto-compact (Boolean): Disable automatic compacting
    If a large number of small files are detected during a backup, or wasted
    space is found after deleting backups, the remote data will be compacted.
    Use this option to disable such automatic compacting and only compact
    when running the compact command.
    * default value: false

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe help purge

Usage: purge <storage-URL> <filenames> [<options>]

  Purges (removes) files from remote backup data. This command can either take
  a list of filenames or use the filters to choose which files to purge. The
  purge process creates new filesets on the remote destination with the
  purged files removed, and will start the compacting process after a purge.
  By default, the matching files are purged in all versions, but this can be
  limited by choosing one or more versions. To test what will happen, use the
  --dry-run flag.

  --dry-run
    Performs the operation, but does not write changes to the local database
    or the remote storage
  --version=<int>
    Selects specific versions to purge from, multiple versions can be
    specified with commas
  --time=<time>
    Selects a specific version to purge from
  --no-auto-compact
    Performs a compact process after purging files
  --include=<filter>
    Selects files to purge, using filter syntax




C:\Program Files\Duplicati 2>

Thanks!

That makes sense now. :slight_smile:

A question for you: I want to contribute back to this project a bit. Would it be OK if I wrote a documentation page or two in the manual's articles (here) on setting up SSH keys and Duplicati for backups to an SFTP backend? Additionally, a security section describing how to configure Duplicati to prevent unwanted deletions? I ask because it took me a few rounds of reading and thinking to get automatic backups functioning in a clean, secure state.

I’d just write it up, then do a git pull request for review.

Perhaps @brad could clarify what help was asked for

My goal is simple: configure the SFTP destination to allow read/append access, but prevent overwriting existing data or deletion. Then, if the source machine is compromised, the attacker cannot compromise the backup.
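One common starting point on an OpenSSH server is a chrooted, SFTP-only match block; a hedged sketch, where the user name and path are assumptions. Note this alone does not block deletes or overwrites; that still needs filesystem-level restrictions or something like the server-side "soft delete" mentioned earlier:

```shell
# Hedged sketch for /etc/ssh/sshd_config: confine an (assumed) backup user
# to SFTP only, inside a chroot, with no forwarding.
# Match User duplicati-backup
#     ChrootDirectory /srv/backups/%u
#     ForceCommand internal-sftp
#     AllowTcpForwarding no
#     X11Forwarding no
```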


Contributions are warmly welcomed. Code, documentation, testing. Anything goes :slight_smile:


Any help to improve the documentation is welcome!

If you want to explain how to configure a specific operation in Duplicati, posting it in the #howto section of this forum is your best choice.

If you want to improve something in the manual (better explanation, fix a typo), just do a PR at Github.

If you want to add some technical information about how the software works that’s applicable to any backend, please add an article to the Articles section of the manual and do a PR.

My guess is that a step-by-step guide on how to configure SSH/SFTP and how to configure blocking tampering backend files fits best in the #howto section of the forums. There are some great documents with comparable content in that category (example).

But you're the author of your files; if you think that your docs relate to general use of Duplicati (instead of a specific part, like a particular backend), feel free to submit a PR on Github!


I’ll try to get something written up this week. Also, I like to play well in the sandbox, so I’ll start with the #howto approach in the forums.

As a heads up, I never knew the #howto section even existed until you pointed it out. I went straight to the online documentation, exploring the manual first and the articles second. I've always considered forums to be for people solving issues, rather than for FAQ guides. I bet others have taken the same approach as me too.

Maybe it would make sense to add links to howto articles in the documentation.

That way users of the documentation will be guided the right way and we won’t fill the actual documentation with information that’s not strictly duplicati information.

How does this look? SFTP/SSH backups to a Linux server with added security


Looks good, thanks for the article!

Kenkendk, I've got a question. I set up a solution that forbids overwriting but allows appending. Last night I realized appending gives a hacker a vector of attack: just append some garbage bits to the end of a .zip.aes Duplicati file, and Duplicati chokes.

Are you saying that appending rights aren’t needed, and that with no-auto-compact, every 1 file on the source gets 1 file on the destination?

No source file "becomes" a file on the destination, since files are split into chunks before upload.

Any file “chunks” that need to be uploaded are bundled into zip files. Any zip file will always have a unique name.

Duplicati doesn’t understand the concept of appending to files. It can only put, delete, get, and list.
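Those four operations map directly onto what a plain SFTP client exposes; a hedged sketch using OpenSSH sftp batch mode, where the host, user, and file names are placeholders:

```shell
# Hedged sketch: the only remote operations Duplicati needs (put, list, get,
# delete), expressed as an OpenSSH sftp batch. Host, user, and file names
# are placeholders, not real Duplicati volume names.
sftp -b - backupuser@backuphost <<'EOF'
put duplicati-20190101T000000Z.dlist.zip.aes
ls -l
get duplicati-b0001.dblock.zip.aes
rm duplicati-b0000.dblock.zip.aes
EOF
```

Since appending is never among those operations, an append-only restriction on the server costs Duplicati nothing.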