Add secure and httpOnly attributes to cookies

Security validation fails on the Duplicati web UI because the xsrf-token cookie is missing the HttpOnly and Secure attributes.
Can this be fixed?

Hello and welcome!

Which security validation are you referring to?

Hello,
If you run a security scan with Greenbone (formerly OpenVAS) on a host exposing Duplicati, the scan engine will complain about two vulnerabilities: the xsrf-token cookie is missing the Secure attribute, and it is missing the HttpOnly attribute.

I was able to experiment with an HAProxy in front of Duplicati and inject the two attributes into the cookies. Everything is fine when I inject the Secure attribute, but when I set the HttpOnly attribute the connection with Duplicati breaks. So I came to the conclusion that Duplicati probably does exactly what the HttpOnly attribute is meant to prevent: “If the HttpOnly flag (optional) is included in the HTTP response header, the cookie cannot be accessed through client side script”.
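For reference, what I tried at the proxy looked roughly like this (a sketch from memory, untested; names, addresses and the certificate path are placeholders for my environment):

```
# haproxy.cfg sketch, untested; names, addresses and cert path are placeholders
frontend duplicati_front
    mode http
    bind *:443 ssl crt /etc/haproxy/certs/duplicati.pem
    default_backend duplicati_back

backend duplicati_back
    mode http
    server dup1 127.0.0.1:8200
    # Appending "; Secure" to every Set-Cookie header works fine:
    http-response replace-header Set-Cookie (.*) \1;\ Secure
    # ...but also appending "; HttpOnly" breaks the web UI, presumably because
    # the browser-side script can then no longer read the xsrf-token cookie:
    # http-response replace-header Set-Cookie (.*) \1;\ Secure;\ HttpOnly
```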

Can you confirm this? Do you think you will try to solve this issue?

Thanks, Stefano.

The main author of Duplicati advised not to expose the web UI to untrusted networks. It’s not hardened for that. If it’s not locked down to localhost, you should only access it by way of a trusted network.

Some have experimented with putting it behind a reverse proxy but I haven’t personally tried it (as I don’t have a need). I also don’t know much about xsrf-tokens so I can’t help with your questions. Maybe someone else on the forum can.

This is definitely bad news; we need to expose the Duplicati web UI to the internet.
We’ll probably have to rethink our application.

If you restrict its access by source IP, it might be ok.

For reference here is the post where the main author mentions this:

You can get in through some hopefully well-hardened (though there are always people trying to break in…) method such as SSH using port forwarding so that a local address talks to remote localhost:8200 or such.

If you don’t like that way to tunnel the (by default unencrypted) web traffic in, there are probably other ways.
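To make that concrete, the port-forwarding approach looks something like this (user and host names are just placeholders):

```
# Forward local port 8200 to the Duplicati UI on the remote machine, then
# browse to http://localhost:8200 on your own workstation; -N just holds
# the tunnel open without running a remote command.
ssh -N -L 8200:localhost:8200 backupadmin@duplicati-host.example.com
```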

Could you implement a VPN connection into that network, completely avoiding the need to directly expose Duplicati?

What we are doing is the following:

We offer a web-based sync&share service on a private cloud; each user/project can easily deploy their own service via a PaaS dashboard.

We need to back up the database and the configuration to remote S3 storage, so the Duplicati backup service comes up automatically alongside the sync&share service, as does a Nagios-based monitoring service.

Users must be able to manage their backups autonomously, so they need easy access to their own backup endpoint. Adding a VPN would be overkill and would make the user experience more complicated.

I’m not really comfortable putting the Duplicati web UI on the Internet or another heavily hostile network. Nothing has been proven one way or the other though, beyond the scan – more thoughts on that below.

Please obtain some good security advice (this is just a user forum), but IMO you would want the system software to be well updated and heavily battle-proven against Internet attacks. You would also want assurance of protection against brute-force password guessing (for example, rate-limited lockout), multi-factor authentication, and so on. Of course it depends on the value of the data that you’re keeping.

Passwords: Our First Line of Defense is a pen tester talking about weak passwords. That’s an issue too. People also tend to reuse passwords, so while passwords are a first line of defense, they’re not enough.

I certainly don’t understand the whole system, but the same issue might apply to the sync&share service.

You might be able to make up for some weaknesses by front-ending with a more secure captive portal or similar authentication device – something that you might use to raise security for some legacy web apps.

If any part of this is shared by several users, there’s also the question of security between different users. Duplicati configuration must be done carefully in order to keep separation at both source and destination. Duplicati is designed to handle an untrusted destination, not an untrusted user (or malware) on the source system.

A scanner is IMO a generic tool to look at limited things and raise potential flags that then need followup from security experts, in conjunction with experts on the system being tested (none available, but I’ll try).

Did you also test Duplicati in its default HTTP (not HTTPS) configuration? It seems like the Secure attribute would break things there, since a Secure cookie is only sent over HTTPS.

I think it’s protecting against cross-site request forgery with a Cookie-to-header token design. Wikipedia:

The CSRF token cookie must not have httpOnly flag, as it is intended to be read by the JavaScript by design.

This technique is implemented by many modern frameworks, such as Django and AngularJS.
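As a rough illustration of that pattern (a generic sketch, not Duplicati’s actual code): the page script reads the token cookie and echoes it back in a request header, which is exactly what HttpOnly would forbid.

```typescript
// Generic sketch of the cookie-to-header token pattern (not Duplicati's actual code).
// The server puts a random token in a cookie that JavaScript is allowed to read;
// the page script copies it into a custom header on each request. A cross-site
// attacker cannot read the cookie, so it cannot forge the matching header.

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
  return match ? decodeURIComponent(match[1]) : null;
}

async function postWithXsrf(url: string, body: unknown): Promise<Response> {
  // This read is exactly what the HttpOnly flag forbids, which would explain
  // why injecting HttpOnly at the proxy breaks the UI.
  const token = readCookie("xsrf-token");
  return fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // AngularJS's $http does the same thing automatically via X-XSRF-TOKEN.
      "X-XSRF-TOKEN": token ?? "",
    },
    body: JSON.stringify(body),
  });
}
```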

Duplicati uses AngularJS. Files below show some of its cookie and header work. Please search for xsrf:

Code:

https://github.com/duplicati/duplicati/blob/master/Duplicati/Server/webroot/ngax/scripts/services/AppService.js

https://github.com/duplicati/duplicati/blob/master/Duplicati/Server/webroot/ngax/scripts/angular/angular.min.js

More:

Cross Site Request Forgery (XSRF) Protection (AngularJS)

Cookie-to-header token CSRF protection

Why is it common to put CSRF prevention tokens in cookies?

https://github.com/duplicati/httpserver shows the history of the code.

https://github.com/duplicati/httpserver/blob/master/HttpServer/ResponseCookie.cs shows the cookie capabilities.
Unless some developer knows how to build cookies differently, getting web server fixes may be tough.
Long-term hope is that this server can be replaced with something more modern and well-maintained.

Replace custom web server with Kestrel implementation #4535, which may or may not be coupled with
Net 5 migration #3124, another modernization effort. All of these things require volunteers.