I run Duplicati on Linux. I have my own custom HTTP reporting web service that presents an HTTPS cert signed by my internal CA (NOT a self-signed certificate: a proper root->intermediate->leaf setup, but the root is my own, not a public root). The root is properly installed so that update-ca-certificates adds it to the system trust store. Programs like curl, wget, and even Python programs “just work” without any special setup or arguments.
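For example, on any client with the root installed, checks like these all pass without extra flags (duplicati.example.net is a stand-in for my reporting host):

```
# Sanity check that the internal root is trusted system-wide; an HTTP error
# status here is fine, only a TLS verification failure would matter.
curl -sS -o /dev/null -w 'curl: HTTP %{http_code}\n' https://duplicati.example.net/
wget -q --spider https://duplicati.example.net/ ; echo "wget exit: $?"
python3 -c 'import urllib.request, urllib.error
try: urllib.request.urlopen("https://duplicati.example.net/")
except urllib.error.HTTPError: pass
print("python: TLS OK")'
```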
With Duplicati v2.0.x I could update the Mono CA trust (Debian plugs Mono into the system CA trust update, but it's using Mono's certmgr at the core) and the HTTP requests worked just fine. Now with v2.1.x, the Mono trust store is no longer consulted and the requests just fail. This results in the backup reporting 2 warnings on the home screen that don't appear in the backup log, and no report being sent to my web service. I DO NOT WANT to disable SSL validation entirely, nor do I want to specify a specific certificate hash that will have to be updated on a dozen different backup clients every time the certificate renews, currently yearly (maybe soon every 90 days, given what Chromium and Mozilla are proposing). I want to trust my root cert. How do I specify a certificate chain to trust?
You can specify which certificate hash you want to trust with the send-http-accept-specified-ssl-hash option, under the Http Report Module in the backup's advanced options.
send-http-accept-specified-ssl-hash: If your server certificate is reported as invalid (e.g. with self-signed certificates), you can supply the certificate hash (SHA1) to approve it anyway. The hash value must be entered in hex format without spaces or colons. You can enter multiple hashes separated by commas.
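If it helps, the hash in that format can be taken straight from the certificate with openssl (the path below is just a placeholder for wherever your leaf certificate lives):

```
# SHA1 fingerprint in bare hex, as the option expects (no colons or spaces).
openssl x509 -in /path/to/leaf.crt -noout -fingerprint -sha1 | cut -d= -f2 | tr -d ':'
# Paste the result into send-http-accept-specified-ssl-hash; multiple hashes
# can be comma-separated.
```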
I read that as the hash of the leaf certificate, though, correct? That’s what I would expect based on similar options I’ve encountered in other programs.
I explicitly DO NOT want to do that, because when the certificate expires every year I will also have to simultaneously update a dozen different backup clients with the new hash. That's potentially possible (though very annoying) for the machines I directly control, but getting users to edit their configuration and get it correct all on the same day is never going to happen.
I read that as the hash of the leaf certificate, though, correct?
That’s correct, it checks leaf certificates. I believe this could be a feature request.
I did a quick search on the SSL_CERT_FILE and SSL_CERT_DIR environment variables and how they can override dotnet's behavior, but they could end up affecting certificate validation for other roots, and I don't have a setup where I could test this.
It’s useful to know those environment variables are used. I have set SSL_CERT_DIR for the server process on my canary host for 2.1 and will report back in the morning with how tonight’s backup goes.
However, this gave me an idea. If this is using the openssl library, I should be able to test the behavior with openssl s_client. And sure enough, with -verify_return_error -CApath /etc/ssl/certs the connection is terminated with a certificate error. I was able to make a successful connection using -CAfile /etc/ssl/certs/ca-certificates.crt though. So I have updated my override.conf for duplicati.service to set SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt and I will see how tonight’s backup goes.
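For anyone wanting to reproduce this, the tests and the drop-in were roughly the following (duplicati.example.net again stands in for my endpoint; adjust the unit name if yours differs):

```
# s_client tests: the first fails with a verify error here, the second succeeds.
openssl s_client -connect duplicati.example.net:443 -verify_return_error -CApath /etc/ssl/certs </dev/null
openssl s_client -connect duplicati.example.net:443 -verify_return_error -CAfile /etc/ssl/certs/ca-certificates.crt </dev/null

# systemd drop-in that sets SSL_CERT_FILE for the Duplicati service
# (same effect as `systemctl edit duplicati.service`).
sudo mkdir -p /etc/systemd/system/duplicati.service.d
cat <<'EOF' | sudo tee /etc/systemd/system/duplicati.service.d/override.conf
[Service]
Environment=SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt
EOF
sudo systemctl daemon-reload
sudo systemctl restart duplicati.service
```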
Still no love with SSL_CERT_FILE. Same “2 warnings” showing in the GUI with no warnings in the logs, and no report sent to my tool.
I can try to dig through the docs for libssl to see if there’s something else that can be set that changes the library behavior. Other than that, looking for other suggestions.
I believe the only way to solve it will be to add functionality to the code that scans the whole chain for the hash specified in send-http-accept-specified-ssl-hash.
I put together a small dotnet utility to print out the certificate validation status and the chain. If you are up to it, you can test with it to see specifically what the Certificate validation: and SSL Policy errors: properties evaluate to in your environment, and whether the chain is properly listed.
The box where I encountered this problem was my first test upgrading to 2.1. It is a Raspberry Pi whose SD card I am now starting to question, and I can't get the .NET SDK installed because extracting the zip crashes the box.
I’m going to upgrade another client. My experience with Duplicati says I will likely encounter this same issue on all of my clients. After confirming the behavior, I’ll run your utility.
Sorry for the delay in testing this out, but I do finally have a nice, normal Debian 12 x86_64 box upgraded to 2.1, and it does have the same SSL issue. I downloaded the .NET 8.0 SDK and successfully built certchaindump. Testing an endpoint with a Let’s Encrypt certificate gives me Certificate validation: True and SSL Policy errors: None which I would expect. Testing my reporting endpoint gives me Certificate validation: False and SSL Policy errors: None. The dump of the chain that follows does have the correct leaf, intermediate, and root certificates in it. I also tried export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt and then running the utility with the same result. And just because libssl is sometimes weird, I tried export SSL_CERT_FILE=/path/to/just/my/intermediate+root_chain.crt also with the same result.
I don’t have much expertise in .NET internals, but FWIW PowerShell is built on .NET. Invoke-WebRequest could be tested to see if it works, or puts up a Duplicati-like error.
I guess the server would send the leaf and intermediate only? Does s_client show that?
The reason I ask is that the details of cert evaluation seem somewhat variable, and the DST Root CA X3 Expiration (September 2021) headache that broke mono fell into that.
Point here is just that cert validation is complex, and not always working as one may like.
Mostly, I’m curious if what’s sent would bother PowerShell, or if it’s only a Duplicati issue.
If somehow PowerShell works nicely, then a question would be what it’s doing differently.
If @madfordmac could test on PowerShell that would be interesting, but from a quick dive starting here and here I did not find anything indicating it would be different.
Installing PowerShell on Debian has some details, if @madfordmac feels like trying the test.
Note that I’m also wondering if exactly which certificates the server sends may affect results.
EDIT:
Another motivation would be that if PowerShell fails, it’s easier for web searches to get ideas.
The entire Internet knows more than I do, but it helps if the question is available in general terms.
I installed PowerShell 7.5 on my x64 test box. I was able to use Invoke-WebRequest on my endpoint without issue. Given there's a flag to ignore certificate validation, I would assume that the default behavior is to validate certificates.
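For the record, the test was essentially a one-liner (same placeholder hostname as before):

```
# PowerShell 7 validates against the system store; -SkipCertificateCheck is the
# flag that would bypass validation, and it was NOT needed here.
pwsh -Command 'Invoke-WebRequest -Uri "https://duplicati.example.net/cgi-bin/submit.cgi" | Select-Object StatusCode'
```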
I checked my nginx config and my server is currently only serving the leaf certificate. Both the intermediate and root certificates are in the system trust store. It is probably more typical to serve both the leaf and intermediate from an HTTPS endpoint. I can try adding the intermediate to my config and see if @marceloduplicati's tool likes that better.
I added the intermediate to the server side, and confirmed with openssl s_client -showcerts that the intermediate is now being served. Interestingly, that did make s_client validate the certificate correctly without using the CApath flag. Invoke-WebRequest still validates it correctly (no surprise) but @marceloduplicati’s utility still says Certificate validation: False. I’ll check my backup in the morning and see if Duplicati happens to like it.
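The check for what the server actually sends was just:

```
# Dump everything nginx presents; the Certificate chain section should now
# list both the leaf and the intermediate.
openssl s_client -connect duplicati.example.net:443 -showcerts </dev/null
```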
Both the backup client and the web server receiving the report are Debian systems. Root, intermediate, and leaf certificates are generated using your favorite tool; I use the cfssl CLI tools. The PEM-format intermediate and root certs are concatenated into a single file (intermediate above root) and placed in /usr/local/share/ca-certificates/, and then /usr/sbin/update-ca-certificates is run to update the trust store in /etc/ssl/.
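Roughly, those steps are (the source paths are placeholders for wherever cfssl wrote the PEM files):

```
# update-ca-certificates only picks up files ending in .crt from this directory.
cat intermediate.pem root.pem | sudo tee /usr/local/share/ca-certificates/my-internal-ca.crt
sudo /usr/sbin/update-ca-certificates
# The chain ends up appended to /etc/ssl/certs/ca-certificates.crt, with
# hashed symlinks created under /etc/ssl/certs/.
```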
The reporting server is running nginx+fcgiwrap to run a Python CGI script that parses the report and files it in an Elasticsearch database, but the CGI script isn't involved in the certificate handling at all. For testing, a PowerShell or bash script that dumps stdin to a text file would be sufficient. Here's the nginx config:
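(Trimmed to the relevant parts; the server name and file paths here are representative placeholders rather than my exact values.)

```
server {
    listen 443 ssl;
    server_name duplicati.example.net;

    # leaf + intermediate concatenated; key alongside it
    ssl_certificate     /etc/nginx/ssl/duplicati.example.net.chain.pem;
    ssl_certificate_key /etc/nginx/ssl/duplicati.example.net.key;

    location /cgi-bin/ {
        gzip off;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME /usr/lib/cgi-bin$fastcgi_script_name;
        fastcgi_pass unix:/run/fcgiwrap.socket;
    }
}
```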
The current state of the ssl_certificate file is a concatenated chain of leaf+intermediate. I believe fcgiwrap just has the stock configuration from the package.
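If anyone wants to stand up a test receiver, the CGI script really can be that dumb; something like this, installed as /usr/lib/cgi-bin/submit.cgi and made executable, would do:

```
#!/bin/bash
# Minimal stand-in for submit.cgi: save the POSTed report body, answer 200.
cat > "/tmp/duplicati-report-$(date +%s).txt"
echo "Content-Type: text/plain"
echo ""
echo "OK"
```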
Duplicati is then configured with send-http-url = https://duplicati.example.net/cgi-bin/submit.cgi?id=0123456789abcdef and send-http-extra-parameters = password=fedcba9876543210. Obviously I’m validating the id and password. I don’t think it meaningfully changes a testing setup to just end the URL with the script name and skip the extra args.
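(For completeness, on the command line those options would be passed roughly like this, using the example values above; the backend target and source path are placeholders.)

```
duplicati-cli backup <target-url> <source-path> \
  --send-http-url="https://duplicati.example.net/cgi-bin/submit.cgi?id=0123456789abcdef" \
  --send-http-extra-parameters="password=fedcba9876543210"
```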
If it would be easier, I can generate valid credentials and send them to you along with the actual endpoint and my SSL chain and you can test against my endpoint. I can just delete the records for the test ID later.