Best Practices for Securing S3 Credentials in Duplicati from Foreign Actors?

I am running Duplicati as a service. Admins are in the admin group and users are in the users group in Windows.

Curious as to how I would go about doing this via ACLs in Windows. Am I locking down the database file and only allowing the SYSTEM user to make changes to it?

I found a few posts on here from 2017 explaining how the passwords are secured but I’d like to know the current state of how the credentials are stored and protected. I’m probably wrong here but I vaguely remember reading something about how the database is encrypted with a weak default password in Windows?

Assuming an attacker bypasses the Duplicati web UI password and somehow gets hold of the S3 credentials from the Duplicati database, they could just use an S3 browser and wipe the files on the remote side. If I lock down the files on the remote side, making them immutable, then Duplicati can't do its thing and prune the archive when it needs to. The only thing I can think to do is let the backup grow indefinitely and not keep only XYZ versions. I can't remember whether Duplicati still prunes at that point, or just slaps new block files onto the backup location.

How are you guys mitigating against this?

Doing what? There's no prior objective stated in the post. If you mean the topic title: your service is running as SYSTEM by default (which is dangerous because Windows version upgrades wipe out files stored under the SYSTEM profile), so you get some protection from file access by ordinary users, but not from an elevated Administrator.

The Duplicati-server.sqlite file is also weakly encrypted with a default password which you could change. The purpose of the scrambling is to impede simple string scanners and maybe SQLite database readers (because unfortunately for legitimate usage, not many tools know how to deal with this encryption at all).

So an elevated Administrator bypasses both the file permissions and the encryption, and there's no separate login for the administrator, although that would be a nice enhancement. I think there's a feature request, but not enough volunteers to help.

Basically, Duplicati trusts its user. It will also show secrets to them in Export As Command-line, because they have to be passed to the command line somehow. A --parameters-file can at least keep them a bit private.
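As a sketch of the --parameters-file idea: the options live in a text file readable only by the service account, and only the file path appears on the command line. The paths, bucket name, and credential values below are placeholders, not anything from this thread.

```shell
# parameters.txt (locked down via NTFS permissions), one option per line, e.g.:
#   --aws-access-key-id=AKIA_PLACEHOLDER
#   --aws-secret-access-key=PLACEHOLDER

# Reference the file instead of passing secrets directly on the command line:
Duplicati.CommandLine.exe backup "s3://example-bucket/backup" "C:\Users\alice\Documents" --parameters-file="C:\ProgramData\Duplicati\parameters.txt"
```

Anyone who can read the process's command line still sees only the file path, not the keys, so this is privacy from casual inspection rather than real protection from an admin.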

It will also use such secrets when going to S3, because S3 needs to authenticate it. A system owned by some sophisticated attacker is kind of lost because they can simply get its secrets out using a debugger.

This is how people try to save backups of a completely compromised system. There are forum posts on that; however, it's difficult and do-it-yourself. If Duplicati never deletes, the backup gets huge and gets slow. Keeping all versions and never compacting is the main path to this, but upload retries can be a concern.
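A minimal sketch of the "never delete" job shape described above: set no retention option (so no versions are ever pruned) and disable auto-compact (so old volumes are never rewritten or removed). The option name is from Duplicati's advanced options; verify it against your version, and the bucket/path are placeholders.

```shell
# No --keep-time / --keep-versions / --retention-policy set: all versions kept.
# --no-auto-compact stops Duplicati from deleting and rewriting old volumes.
Duplicati.CommandLine.exe backup "s3://example-bucket/backup" "C:\Data" --no-auto-compact=true
```

Note this only makes Duplicati itself delete nothing; an attacker with the S3 credentials can still delete, which is why object lock or snapshots come up next.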

S3 compatible providers sometimes support object lock, but Duplicati like many programs doesn’t use it. There might be a web page somewhere of some that do. I can think of two, one expensive, one far less.

There are other kind-of-awful schemes one could do, such as periodically snapshot regular destinations into immutable ones. If you do that, snapshot of Duplicati local database at the time would ease restore. Immutable need not be truly immutable, just immutable against anything that the client system could do. Limited-time immutability could also be achieved by object lock, I think, but it’s a fairly complicated area.

It really depends on what you’re trying to protect against. Some people do file-level security of database by encrypted filesystem. That can help protect stolen laptops, but there are lots of attacks it doesn’t stop.

From the sounds of it, I'm thinking you plan to install Duplicati on users' workstations to back up the users' local data to an S3 account that you don't want users to see? So long as you define the jobs and users don't need to modify them, you should be good with the following.

-First off, no tray icon. Out of sight, out of mind. With that also remove all the Start/Desktop menu entries.

-Second, you’re already running as a service which is great but be sure to move the config files from the default location, otherwise your config can get wiped out following certain Windows updates. See this thread for further details.

-Thirdly, remove any user permissions from the Duplicati folders. The service should still be able to access everything it needs, and when logged in as an admin everything should be normal. Users should get an Access Denied, or a "click Continue to get permission" message followed by an elevation prompt, when trying to access anything within those folders.
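The permission stripping in the third step could be sketched with icacls, run from an elevated prompt. The folder paths are examples (the data folder depends on where you moved the config in step two); SYSTEM and Administrators keep their access, only the Users group grant is removed.

```shell
# Convert inherited permissions to explicit ones so we can edit them,
# then drop the Users group from the program and data folders.
icacls "C:\Program Files\Duplicati 2" /inheritance:d
icacls "C:\Program Files\Duplicati 2" /remove:g "BUILTIN\Users"
icacls "C:\ProgramData\Duplicati" /inheritance:d
icacls "C:\ProgramData\Duplicati" /remove:g "BUILTIN\Users"
```

Test afterwards as a non-admin user that the folders give Access Denied, and as the service that backups still run.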

Once removed, users won't be able to access C:\Program Files\Duplicati 2. This prevents them from seeing or copying anything directly out of that folder, but (and it's a big but) they can still get to everything via the webGUI (because it's running as a service). The only thing you can do then is enable --webservice-password, or find another way to prevent users from accessing localhost:8200, maybe a firewall.
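Setting that password on the service might look like the following, assuming the service was installed with Duplicati.WindowsService.exe (extra arguments get passed through to the server); the passphrase is obviously a placeholder.

```shell
# Reinstall the service with a web UI password so localhost:8200 prompts for it.
Duplicati.WindowsService.exe uninstall
Duplicati.WindowsService.exe install --webservice-password="long-random-passphrase"
```

Keep in mind an admin on the box can still read or change the service arguments, so this stops ordinary users, not Administrators.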

I suppose you could just use predefined CLI jobs (stored in a folder users can't access), scheduled by something else, like Task Scheduler maybe.
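That Task Scheduler idea could be sketched like this: a batch file holding the Duplicati CLI job sits in a locked-down folder, and a scheduled task runs it as SYSTEM so no user session is involved. Task name, time, and path are all examples.

```shell
# Nightly run of a predefined CLI job as SYSTEM; users can't read the .bat
# because its folder has had the Users group removed.
schtasks /Create /TN "DuplicatiBackup" /SC DAILY /ST 02:00 /RU SYSTEM /TR "C:\ProgramData\DuplicatiJobs\nightly-backup.bat"
```

This sidesteps the webGUI entirely, at the cost of losing the UI's status and restore conveniences.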

If this is a file server rather than a local workstation, then user access to the local Duplicati files shouldn't be an issue (i.e. no need to change file permissions); just specify --webservice-interface=loopback on the service to keep webGUI access to those who can log in locally, presumably only the admins.
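For completeness, pinning the service to loopback could look like this (again assuming the Duplicati.WindowsService.exe installer, with pass-through server options; loopback is also the default binding).

```shell
# Bind the web UI to 127.0.0.1 only, so it is unreachable over the network.
Duplicati.WindowsService.exe install --webservice-interface=loopback --webservice-port=8200
```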

With all that said, Duplicati is still in Beta and far from hardened for these sorts of usages so test, test and test some more.

Hope this helps a bit.


I’m still confused by the topic title’s “Foreign Actors”, but I did forget to mention that only localhost is enabled by default; one has to work to enable non-local access. Be really careful if it’s actually on the Internet. Firewall rules with a small hole for authorized access might work, as might a more Internet-safe access method such as an up-to-date SSH or an equivalent Windows facility. Securing the entire system properly is more than Duplicati can do. Other good practices include not reusing passwords, and limiting the access any given password grants, e.g. to storage. If one system gets compromised, don’t risk all.
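The "small hole" firewall rule could be sketched with netsh, if the UI really must be reachable over the network; the port matches the default webGUI port mentioned earlier, and the admin subnet is an example.

```shell
# Allow the webGUI port only from a trusted admin subnet; everything else
# stays blocked by the default inbound policy.
netsh advfirewall firewall add rule name="Duplicati webGUI (admins only)" dir=in action=allow protocol=TCP localport=8200 remoteip=10.0.5.0/24
```

Tunneling over SSH instead, as suggested above, avoids exposing the port at all and is probably the safer choice.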

I sometimes wake up at night, drenched in a cold sweat, petrified that Ewan McGregor has stolen my S3 credentials.

All joking aside, I’m not too sure either but I think we’ve laid out most of the options.
