
SslStream only sends chain certs if they're in the Windows Cert Store #26323

Open
cocowalla opened this issue May 30, 2018 · 44 comments

Comments

@cocowalla

I'm using the RabbitMQ C# Client, which under the hood uses SslStream. I'm having an issue where clients are unable to authenticate using x509 certificates if intermediate certificates are involved - such a chain looks like:

Root CA -> Issuing CA -> Issued Client Certificate

Using Wireshark I can see that when authenticating as a client, SslStream is sending only the leaf certificate, which is causing a certificate handshake error. However, if the Root CA and Issuing CA are added to the Windows Certificate Store as trusted roots, then SslStream sends all 3 certificates, and RabbitMQ is happy.

The certificate I'm using as the client cert is a PKCS#12 file that contains the whole chain (as X509Certificate2). So, the question is if there is any way to force SslStream to send the whole chain when authenticating as a client, even if the chain certs are not in the Windows Certificate Store?
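For illustration, a minimal sketch of the client setup described above (host name, port, file path and password are placeholders; the RabbitMQ client performs the equivalent SslStream call internally):

using System;
using System.Net.Security;
using System.Net.Sockets;
using System.Security.Cryptography.X509Certificates;

// Load the PKCS#12 file; the collection holds the leaf plus the chain certs.
var certs = new X509Certificate2Collection();
certs.Import("client-chain.p12", "p12-password", X509KeyStorageFlags.DefaultKeySet);

using var tcp = new TcpClient("rabbitmq.example.com", 5671);
using var ssl = new SslStream(tcp.GetStream());

// Only the selected leaf certificate is guaranteed to go on the wire; the other
// collection members are ignored unless the OS can resolve them from a cert store.
await ssl.AuthenticateAsClientAsync("rabbitmq.example.com", certs, checkCertificateRevocation: false);
Console.WriteLine($"Mutually authenticated: {ssl.IsMutuallyAuthenticated}");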

@davidsh
Contributor

davidsh commented May 30, 2018

cc: @bartonjs

@Clockwork-Muse
Contributor

...except you're using the file, not a cert from the store, as the client identity, right? Very probably it's not loading the entire chain from the file (it only actually loads one cert), so the leaf has a reference to the issuer cert (a thumbprint? not sure), but no idea what the actual cert is.
If I remember right, you have to load the entire set of certs into a runtime store, then grab the leaf certificate from that. The reason it works when the certs are in the Windows cert store is that chain lookup is done there by default.
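A rough sketch of what that suggestion describes (file path and password are placeholders; as the replies below note, in practice this only helps when the certs land in one of the well-known stores such as My or CertificateAuthority):

using System.Linq;
using System.Security.Cryptography.X509Certificates;

// Load everything from the PKCS#12 file...
var certs = new X509Certificate2Collection();
certs.Import("client-chain.p12", "p12-password", X509KeyStorageFlags.PersistKeySet);

// ...push all of it into a store the process can see...
using var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadWrite);
store.AddRange(certs);

// ...and select the leaf (the entry carrying the private key) back out of that store.
var leaf = certs.Cast<X509Certificate2>().First(c => c.HasPrivateKey);
var fromStore = store.Certificates.Find(X509FindType.FindByThumbprint, leaf.Thumbprint, validOnly: false)[0];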

@cocowalla
Author

cocowalla commented May 31, 2018

@Clockwork-Muse I'm currently selecting the certificate from an X509CertificateCollection that contains the full chain.

I figured the solution might involve creating some kind of temporary X509Store, but if I create a unique one for my app and select a certificate from it, it's still the case that only the leaf is sent - it only seems to send the whole chain if the other certs are in the My store (strangely, not the Machine store).

If you have any info on how to create a temporary store only visible to the app and have SslStream send the whole chain, some guidance would be much appreciated.

@cocowalla
Author

cocowalla commented Jun 1, 2018

I've checked from Linux, and observed the same behavior - the chain of intermediate certs is sent if I use the My certificate store, but not if I create a unique one with new X509Store("whatever").

I guess this is happening in SecureChannel, but that is a bit of a whale, and I'm struggling to find where in the code this actually takes place.

@cocowalla
Author

I've managed to get it working as expected on Linux, by using the SSL_CERT_FILE environment variable:

SSL_CERT_FILE=/opt/my-app/etc/ca-bundle.crt ./My.App

When doing this, SslStream is sending the intermediate certificate as well as the leaf, and RabbitMQ is happy.

Is there anything like this for Windows? @bartonjs, I guess if anyone would know it would be you? ;)

@Clockwork-Muse
Contributor

@cocowalla - well, often, SSL_CERT_FILE/SSL_CERT_DIR are set to a global location, so it ends up being the equivalent of the Windows certificate store (I'm currently cursing Python, because it uses about 3 or 4 different sets of certs during setup, pulled in from different packages). I guess the closest equivalent would be certs that only certain users would have permission to access (which you'd normally want to do for the private key anyways).

I thought I'd discovered a way to load/send the entire chain when dealing with a different issue, but I can't recreate it now, possibly I just imagined it.

What's your usecase here, that you wouldn't be adding the certs to the store anyways?

@cocowalla
Author

cocowalla commented Jun 2, 2018

Regarding SSL_CERT_FILE/SSL_CERT_DIR, these certs are only for use by this particular application, so the env var is set just for the application.

My usecase was a (failed!) attempt to keep things simple by using the same approach on both Windows and Linux (keys stored in the filesystem, protected by ACLs). I've decided to give up; I'll use SSL_CERT_FILE on Linux, and the My root and intermediate stores on Windows.

I do however still think that SslStream should have some means of providing chain certs from ephemeral stores or a X509Certificate2Collection.

@Clockwork-Muse
Contributor

While I agree that it would be nice if SslStream would send the entire chain (which would probably require the actual cert domain types keeping their references, or whatever), there's a problem with that: you're sidestepping the normal mechanism for chain resolution on the platforms. If the cert ends up in a resource bundle or something, it ends up being more difficult for the end user to configure.

@cocowalla
Author

cocowalla commented Jun 2, 2018

I don't really see it as side-stepping anything; more augmenting the existing mechanism. The My store would still be used, but the ephemeral store or X509Certificate2Collection the cert came from would additionally be used to locate chain certs. I imagine most devs would expect this behaviour; if you load a chain from a PKCS12 file, it seems counter-intuitive to only use the leaf cert and completely ignore the chain certs.

For your last point, I'd argue that most users would be comfortable replacing a file, and probably haven't even heard of the Windows Certificate Store ;)

@bartonjs
Member

bartonjs commented Jun 4, 2018

I don't really know the SChannel APIs (which provide TLS for us on Windows), but I think that they only take the single certificate, then internally do the chain building for sending the intermediates.

The Linux and macOS TLS APIs are a little more raw, so they might be more amenable to such a feature, but I wouldn't add a new feature that doesn't work on Windows (71% of the usage of .NET Core, as of last summer).

@cocowalla
Author

@bartonjs ah, that explains why I couldn't find anything in SecureChannel that sends the intermediate certs - I didn't realise it was using an SChannel API that's hard-coded to check against specific stores.

I agree it makes no sense to add this unless it works across all platforms.

@gingters

I'd like to bring this issue back to the table.

We have an ASP.NET Core based service that is deployed onto our clients' Windows computers. These machines are most likely not publicly reachable from the internet, and the HTTP API that our service provides should be secured. We have a self-signed root CA and an intermediate CA, and the intermediate issues a certificate for each installation. This way we can also do client-certificate auth with our service acting as the client.

We really want to avoid installing our root and intermediate certificates into the trusted root CA store of our clients' computers. We consider this bad security practice and really want to avoid it.

Technically it is sufficient to pin our root CA cert within our dedicated client application. There is no need for the computer as a whole to trust our cert.

So there needs to be a way to tell Kestrel on Windows (which uses SslStream) to send not only the actual server certificate we configured but the complete chain, without installing the chain in the trusted root store of the computer.

This is pretty counter-intuitive: you create a .pfx file containing the complete chain (the server cert, the server's private key, and both the root and intermediate CA certs), tell Kestrel to use this file, and then, when it's not working, find out that the X509Certificate2 only represents a single cert from the file, not the chain - and there is no other way to tell the system where to find the certificates it should send along, except for exposing your client's computer to a risk that should usually be avoided at all costs.

So, what is the actual plan to overcome this issue? Is there anything we could do to help get this sorted out?

@davidsh
Contributor

davidsh commented Oct 15, 2019

So, what is the actual plan to overcome this issue? Is there anything we could do to help get this sorted out?

I agree with your thoughts about the importance of this scenario. It should "just work".

As far as making progress on this issue, we need to understand the capabilities/APIs of the underlying TLS stacks (SChannel for Windows, OpenSSL for Linux, and Apple's stack for macOS). At this point, we aren't sure the platforms support this functionality.

@msftgits msftgits transferred this issue from dotnet/corefx Jan 31, 2020
@msftgits msftgits added this to the Future milestone Jan 31, 2020
@JesperTreetop
Contributor

JesperTreetop commented Feb 12, 2020

I have almost exactly the same situation as @gingters. We would like to send the full chain if possible - and would also like to avoid placing root certificates in a user/computer-wide store for a variety of reasons. As mentioned, telling Kestrel to use a .pfx which provides a full chain and then having the chain certs conceptually "dropped on the floor" is not a good experience, and recent efforts (#31944) to be able to read formats that are often associated with carrying the full chain highlight this.

(For anyone in the same situation who needs to have clients validate a certificate presented without its chain: if you can prime a list of potential certificates and get a grip on their chains beforehand, that's a good workaround. We can't, because the set of potential certificates is large and volatile. Our workaround right now is for the client to connect, receive the certificate, disconnect, use a secondary server or endpoint to retrieve the full chain, and then connect again with the answer in hand. The alternative is to stall synchronously in the certificate validator, which is not only an incredibly bad idea but can also look hostile and malicious to the server, and is liable to be tripped up by defenses or timeouts in Kestrel or Schannel. Asynchronous certificate validation might have fixed this, if this step weren't supposed to be free of huge pauses in the first place.)

Schannel not supporting this is a reasonable explanation of why this happens, but considering that alternate stacks can provide a solution, it would be good to be able to opt into them. (And yes - I realize this is a big hammer, but it would also be able to pound in many nails.)

@maryamariyan maryamariyan added the untriaged New issue has not been triaged by the area owner label Feb 23, 2020
@knapsu

knapsu commented Mar 20, 2020

Hi. We are having the exact same problem. The certificate trust chain is broken because only the server leaf certificate is sent, without the intermediate CA (and root CA). The PFX file has all the certs bundled. Very unexpected and problematic behavior.

As a workaround I am using HAProxy to terminate the SSL traffic and forward it to an unencrypted port.

@bartonjs bartonjs removed the untriaged New issue has not been triaged by the area owner label Jul 7, 2020
@wfurt
Member

wfurt commented Sep 2, 2020

The server part will be fixed in 5.0. There is now an option to pass a CertificateContext with the whole chain - perhaps loaded from a PFX or PEM file. However, on Windows that will add the intermediates to the store if needed - there is no other way to deal with it, as the handshake does not actually happen in the same process space. That approach was recommended by the Windows platform developers.
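As an illustration, a hedged sketch of the 5.0 server-side option (file names are placeholders; the PEM helpers shown need .NET 5+, and a PFX could be loaded into a collection instead):

using System.Net.Security;
using System.Security.Cryptography.X509Certificates;

// Leaf certificate with its private key, plus the intermediate CA certs.
var serverCert = X509Certificate2.CreateFromPemFile("server.crt", "server.key");
var intermediates = new X509Certificate2Collection();
intermediates.ImportFromPemFile("intermediates.pem");

// Builds the chain up front; on Windows the intermediates may get added to a
// cert store as described above, because the handshake runs out of process.
var context = SslStreamCertificateContext.Create(serverCert, intermediates, offline: true);

var serverOptions = new SslServerAuthenticationOptions
{
    ServerCertificateContext = context,
    ClientCertificateRequired = true
};
// await sslStream.AuthenticateAsServerAsync(serverOptions, CancellationToken.None);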

The client part will need some more work and thinking.

It also seems like this morphed from the client to the server side back in 2019. Since this is probably too late for @cocowalla, I'm thinking about closing this and perhaps moving the server discussion to a separate issue if needed and if the 5.0 behavior does not seem sufficient - #35844.

@JesperTreetop
Contributor

Good to see some progress and that this issue is given some thought and attention. (Less good that the workaround is all that can happen on Windows, but that's not the fault of the .NET team.)

@bartonjs
Member

bartonjs commented Jul 6, 2022

Reassigning to System.Net.Security since it's about SslStream. The original issue was about client certs, and so far the SslStreamCertificateContext type (which fixed it for the server role) isn't available to the client role.

@ghost

ghost commented Jul 6, 2022

Tagging subscribers to this area: @dotnet/ncl, @vcsjones
See info in area-owners.md if you want to be subscribed.


@hbertsch

hbertsch commented Nov 22, 2022

I think I am having the same issue here, and so far (after days of trying) no solution for it.

Our backend (which we cannot touch) requires that two intermediate certificates are sent along with the client certificate. In Wireshark we can observe that only the client certificate is sent. When trying this with e.g. Python Requests or curl, this does not cause any issues.

We are now thinking about switching technology since we cannot come up with a proper solution for this. Does anyone have a workaround for this problem? It is a bit shocking that something as "simple" as:

curl -v -s -k --request GET --key-type PEM --key /PATH_TO_CERT_KEY --cert-type PEM --cert /PATH_TO_CERT_WITH_CHAIN https://SOMEHOSTNAME.COM

seems to be impossible to achieve with .NET Core...

This is the structure of our certificate:

-----BEGIN CERTIFICATE-----
// Client cert
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
// intermediate
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
//intermediate
-----END CERTIFICATE-----

@rzikm
Member

rzikm commented Nov 22, 2022

Does anyone have a workaround for this problem?

The workaround is to install the client intermediate certificates into the Windows certificate store (I think the "My" store is the right place).

@bartonjs
Member

I think the "My" store is the right place

Intermediates should go in StoreName.CertificateAuthority, which the Windows Certificate Store UI calls "Intermediate Certificate Authorities". (They'll probably work in the My store, but that's not the expected place)
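In code, priming that store might look roughly like this (paths are placeholders; ImportFromPemFile needs .NET 5+, otherwise load each intermediate with new X509Certificate2(path)):

using System.Security.Cryptography.X509Certificates;

// Intermediates go into "Intermediate Certification Authorities"; this makes them
// discoverable for chain building without marking them as trusted roots.
var intermediates = new X509Certificate2Collection();
intermediates.ImportFromPemFile("intermediates.pem");

using var store = new X509Store(StoreName.CertificateAuthority, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadWrite);
store.AddRange(intermediates);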

@hbertsch

Well, I noticed that I can access my (macOS) store using an X509Store. However, this is only my dev machine, and we have to deploy this stuff to Microsoft Azure on Linux-based Function Apps. I did not test this yet, but I might give it a try, since @cocowalla wrote:

I've checked from Linux, and observed the same behavior - the chain of intermediate certs is sent if I use the My certificate store, but not if I create a unique one with new X509Store("whatever").

So I was thinking to maybe add and remove the intermediate certificates on the fly when executing our tests (context: we have hundreds of virtual entities that are all equipped with client certificates, so it would be cool not to "pollute" the systems too much by adding certificates to the store).

Thank you @bartonjs for the hint about the CertificateAuthority store, I will have a look into this tomorrow!

@wfurt
Member

wfurt commented Nov 22, 2022

This will probably be solved by #71194 in 8.0. Note that on Windows there is no API to pass specific certificates. SslStreamCertificateContext puts the certificates into stores automatically as needed, so callers do not need to worry about it. This is primarily about intermediates. The certificate AND key still need to be in a store on Windows, as the handshake happens in a separate process. (#23749)

@hbertsch

hbertsch commented Nov 23, 2022

Hi @bartonjs @rzikm, I wrote the following code and tried to send it out. Should this be working? I still don't get the certificates sent with the request (no matter whether I use StoreName.CertificateAuthority or StoreName.My). From what I understand, the SslStreamCertificateContext should now grab the certificates implicitly from the store when sending out the request:

private void TestRegistration(X509Certificate2 clientCert, X509Certificate2 l1, X509Certificate2 l2, X509Certificate2 root, string payload)
{
    if (clientCert.HasPrivateKey == false)
        throw new Exception("Client certificate is missing private key");

    // Prime Root store
    var storeRoot = new X509Store(StoreName.AuthRoot);
    storeRoot.Open(OpenFlags.ReadWrite);
    storeRoot.Add(root);
    storeRoot.Close();

    // Prime CA store
    var storeAuthorities = new X509Store(StoreName.CertificateAuthority);
    storeAuthorities.Open(OpenFlags.ReadWrite);
    storeAuthorities.Add(l1);
    storeAuthorities.Add(l2);
    storeAuthorities.Close();

    var handler = new HttpClientHandler();
    handler.ClientCertificateOptions = ClientCertificateOption.Manual;
    handler.SslProtocols = SslProtocols.Tls12;

    // Prime certificate store
    var storeCerts = new X509Store(StoreName.My);
    storeCerts.Open(OpenFlags.ReadWrite);
    storeCerts.Add(clientCert);

    // Freshly fetch the client cert from the store
    var storedCertificate = storeCerts.Certificates.Where(x => x.SerialNumber == clientCert.SerialNumber).First();
    if (storedCertificate.HasPrivateKey == false)
        throw new Exception("Client certificate is missing private key");

    handler.ClientCertificates.Add(storedCertificate);
    storeCerts.Close();

    handler.ServerCertificateCustomValidationCallback =
        (httpRequestMessage, cert, certChain, policyErrors) =>
        {
            return true;
        };

    var content = new StringContent(payload, System.Text.Encoding.UTF8, "application/json");
    var client = new HttpClient(handler);
    var result = client.PutAsync("https://MY_BACKEND_HOSTNAME.com/api/register", content).GetAwaiter().GetResult();

    if (result.StatusCode != HttpStatusCode.OK)
    {
        var statuscode = result.StatusCode;
        throw new SuccessException("Registration failed with " + statuscode.ToString());
    }
}

@JesperTreetop
Contributor

store.Add(clientCert);

I do not know if this is the error, but you are adding the client certificate to the Certificate Authority store too. You may need to add it to the .My store. You may also need to Close/Dispose a store for the changes to take effect - I don't see anything confirming or denying this in the documentation.

@hbertsch

Hi @JesperTreetop, I updated the code above ^ to now have two stores that are closed before sending the request. Still not working. It is pretty hard to guess what is going on behind the scenes when doing it like this. I was wondering how the client will know that the intermediates belong to the client certificate added via handler.ClientCertificates.Add(storedCertificate).

@JesperTreetop
Contributor

I was wondering how the client will know, that the intermediates belong to the client certificate

In the standard X.509 certificate chain-of-trust way. The client certificate is signed/issued by the intermediate certificate, and the intermediate certificate is signed/issued by the root CA certificate. Each certificate also contains "issuer" metadata through which you can find the issuing certificate. See "Issuer" here, for example.

Come to think of it, per the documentation, the intermediate certificates should go in the StoreName.CertificateAuthority store and the root CA certificate should go in the StoreName.Root store. Chain building would be unable to find the root CA certificate (which issued the intermediate certificate) if it only looks in the root store and the root hadn't been added there.
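A quick way to see that issuer/subject linkage for a bundle of certs (path is a placeholder, .NET 5+ for the PEM helper):

using System;
using System.Security.Cryptography.X509Certificates;

var certs = new X509Certificate2Collection();
certs.ImportFromPemFile("chain.pem");   // leaf + intermediates (+ root)

// Each certificate's Issuer should equal the Subject of the certificate that signed it.
foreach (X509Certificate2 cert in certs)
{
    Console.WriteLine($"Subject: {cert.Subject}");
    Console.WriteLine($"Issuer : {cert.Issuer}");
    Console.WriteLine();
}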

@hbertsch

hbertsch commented Nov 23, 2022

Hi @JesperTreetop, thank you for looking into this. I have now (again) updated the code above ^ and put the 3rd-party root certificate into StoreName.AuthRoot. Unfortunately this also did not help. In case you wonder, this is the certificate structure:

root
|_ intermediate L1
     |_ intermediate L2
          |_ client certificate 

They form a valid chain, which I also checked using the openssl verify command. PS: I know it is odd that the client needs to send the L2 and L1 certs to the server, but unfortunately this is out of my hands and we simply must do it this way :/

@JesperTreetop
Contributor

I think maybe this is better answered by someone involved with that code. I'm just some dude guessing. It would be great if this just worked as intended out of the box.

@wfurt
Member

wfurt commented Nov 23, 2022

You should not need to add to AuthRoot. That modifies trust, so it can be dangerous. Putting intermediates into CertificateAuthority makes them available but not trusted. Now, are you on Linux @hbertsch? Just confirming, as the original issue is about Windows. On Linux, the client certificate does not need to be in an X509Store, as the whole handshake happens in the application process.

You can add X509Chain.Build(clientCert) just before handler.ClientCertificates.Add to verify that the chain can be constructed. If it can't, we need to debug that part.

The last part is the certificate itself. Does it have client attributes, e.g. proper KU/EKU? (#26531)

As far as sending the intermediates: the RFC for TLS 1.2 states that the client should send them. Without them, the server may have difficulty constructing the chain if the certificates come from different CAs.

@hbertsch

hbertsch commented Nov 24, 2022

Hi @wfurt, many thanks for your feedback.
We have Macs as developer machines but the target infra is Linux:

Well, I noticed that I can access my (macOS) store using an X509Store. However, this is only my dev machine, and we have to deploy this stuff to Microsoft Azure on Linux

I am sorry if this caused any confusion. To me it looked like a related problem. Should I move this to a new thread?

I tried using X509Chain to check the certificate chain validity, with success.

var chain = new X509Chain();
chain.ChainPolicy.RevocationFlag = X509RevocationFlag.EntireChain;
chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck;
chain.ChainPolicy.TrustMode = X509ChainTrustMode.CustomRootTrust;
chain.ChainPolicy.CustomTrustStore.Add(root);
chain.ChainPolicy.CustomTrustStore.Add(l1);
chain.ChainPolicy.CustomTrustStore.Add(l2);

var buildSuccess = chain.Build(clientCert);
if (buildSuccess == false)
    return;

if (chain.ChainElements.Count != 4)
    return;

// HttpClientHandler
handler.ClientCertificates.Add(clientCert);

The KU & EKU of the client certificate are:

 X509v3 Key Usage: critical
                Digital Signature, Non Repudiation, Key Agreement
 X509v3 Extended Key Usage: 
                TLS Web Server Authentication, TLS Web Client Authentication

@wfurt
Member

wfurt commented Nov 30, 2022

You should not add the intermediates to CustomTrustStore, as there is no way for HttpClient to make use of that at the moment.
The test is to verify that the chain can be built with just OS trust, or by adding the intermediates to the StoreName.CertificateAuthority store. If it cannot, SslStream and HttpClient won't be able to send it. Does that make sense?
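In other words, a diagnostic roughly like this (cert path is a placeholder), run after priming the CertificateAuthority store as above, should come back clean before SslStream/HttpClient can be expected to send the intermediates:

using System;
using System.Security.Cryptography.X509Certificates;

var clientCert = new X509Certificate2("client-cert.cer");   // leaf only, no key needed for this check

// Default system trust plus whatever is in the Intermediate CA store - no CustomTrustStore here.
using var chain = new X509Chain();
chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck;

bool built = chain.Build(clientCert);
Console.WriteLine($"Chain built: {built}, elements: {chain.ChainElements.Count}");
foreach (X509ChainStatus status in chain.ChainStatus)
{
    Console.WriteLine($"{status.Status}: {status.StatusInformation}");
}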

@hbertsch

hbertsch commented Mar 28, 2023

Good day @wfurt, due to other urgent project tasks we postponed the implementation of this particular part. Now I am turning back to it and still have not been able to figure out how to get this to work.

I hope it is OK if we continue discussing this in this thread, since we are developing on macOS and will deploy on Linux on Azure.

To summarize what I need to achieve: we need to send the leaf and the two intermediate certificates to the server (not the root certificate) so that the server is able to trust the client.

I was able to validate the chain using the following code:

[Test]
public void CheckCertificateChainValidity()
{
    var leafCert = X509Certificate2.CreateFromPem(leafCertificate64.ToCharArray());
    var lvl2Intermediate = X509Certificate2.CreateFromPem(LVL_2_Intermediate64.ToCharArray());
    var lvl1Intermediate = X509Certificate2.CreateFromPem(LVL_1_Intermediate64.ToCharArray());
    var rootCa = X509Certificate2.CreateFromPem(rootCa64.ToCharArray());

    // Create a new chain and add the certificates to it
    var chain = new X509Chain();
    chain.ChainPolicy.RevocationFlag = X509RevocationFlag.EntireChain;
    chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck;
    chain.ChainPolicy.TrustMode = X509ChainTrustMode.CustomRootTrust;
    chain.ChainPolicy.VerificationFlags = X509VerificationFlags.AllFlags;
    chain.ChainPolicy.CustomTrustStore.Add(lvl2Intermediate);
    chain.ChainPolicy.CustomTrustStore.Add(lvl1Intermediate);
    chain.ChainPolicy.CustomTrustStore.Add(rootCa);

    chain.Build(leafCert);

    // Check if the chain is valid
    var chainIsValid = chain.ChainStatus.Length == 0;

    if (!chainIsValid)
    {
        var statusText = chain.ChainStatus.First().StatusInformation;
    }

    Assert.That(chainIsValid, Is.EqualTo(true));

    // Clean up
    chain.Reset();
}

However, I do not understand how to properly add/store the certificates so that HttpClient (and, per your explanation, the underlying SslStream) can use and send them to the server.

I did not add those certificates to my system store (macOS Keychain) manually. This should also not be the goal, since the application is built to test a large number of devices (clients), where each device comes with its own client certificate and might have differing intermediate certificates etc.

So I tried using StoreName.CertificateAuthority, without success. Would you please have a look at the code below? Maybe I have a fundamental problem understanding the use of X509Store in conjunction with HttpClient/HttpClientHandler.

[Test]
public async Task RegisterDevice()
{
    var leafCert = X509Certificate2.CreateFromPem(leafCertificate64.ToCharArray());
    var lvl2Intermediate = X509Certificate2.CreateFromPem(LVL_2_Intermediate64.ToCharArray());
    var lvl1Intermediate = X509Certificate2.CreateFromPem(LVL_1_Intermediate64.ToCharArray());
    var rootCa = X509Certificate2.CreateFromPem(rootCa64.ToCharArray());

    // Check the chain validity before continuing
    Assert.DoesNotThrow(() => CheckCertificateChainValidity());

    // Add the intermediates and the root CA to the Intermediate CA store
    var store = new X509Store(StoreName.CertificateAuthority);
    store.Open(OpenFlags.ReadWrite);
    store.Add(lvl2Intermediate); // probably not required
    store.Add(lvl1Intermediate); // probably not required
    store.Add(rootCa);
    store.Close();

    var handler = new HttpClientHandler();
    handler.ServerCertificateCustomValidationCallback = (sender, certificate, chain, sslPolicyErrors) => true;
    handler.ClientCertificateOptions = ClientCertificateOption.Manual;
    handler.SslProtocols = System.Security.Authentication.SslProtocols.Tls12;
    handler.CheckCertificateRevocationList = false;

    var key = ECDsa.Create("ECDsa");
    var keybytes = Convert.FromBase64String(leafKey);
    key.ImportECPrivateKey(keybytes, out _);
    var leafCertWithKey = leafCert.CopyWithPrivateKey(key);

    // Add certificates to the HTTP handler
    handler.ClientCertificates.Add(leafCertWithKey);
    handler.ClientCertificates.Add(lvl2Intermediate);
    handler.ClientCertificates.Add(lvl1Intermediate);

    var httpClient = new HttpClient(handler);
    var xmlContent = File.ReadAllText(payloadPath);
    var requestContent = new StringContent(xmlContent, Encoding.UTF8, "application/xml");
    var response = await httpClient.PostAsync(uri, requestContent);

    Assert.That(response.StatusCode, Is.EqualTo(HttpStatusCode.OK));
}

The result is always:

(Screenshot: only the leaf certificate is sent)

I would be happy to discuss this in person if it is of any help.

@rzikm
Member

rzikm commented Mar 28, 2023

@hbertsch

I did not add those certificates to my system store (macOS Keychain) manually. This should also not be the goal, since the application is built to test a large number of devices (clients), where each device comes with its own client certificate and might have differing intermediate certificates etc.

In 8.0, we have added support for SslClientAuthenticationOptions.ClientCertificateContext, where you can specify a context created via SslStreamCertificateContext.Create(certificate, intermediateCerts, ...). It will internally build the chain, and if it builds, the chain will be used and sent over the wire.

Unfortunately, you can't provide the entire chain to SslClient/HttpClient in 7.0 (or earlier). You can provide only the (leaf) client certificate and SslStream will build the SslStreamCertificateContext internally (without specifying additional intermediates), so if the intermediates are somehow custom (i.e. not present in the system-wide intermediate certs stores), they will not be found and will not be sent over the wire.
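For a raw SslStream client on .NET 8+, that looks roughly like this (host, port, and file paths are placeholders); the same SslClientAuthenticationOptions can also be assigned to SocketsHttpHandler.SslOptions for use with HttpClient, which is what a later comment in this thread ends up doing:

using System.Net.Security;
using System.Net.Sockets;
using System.Security.Cryptography.X509Certificates;
using System.Threading;

// Leaf certificate with key, plus its intermediates.
var leaf = X509Certificate2.CreateFromPemFile("client.crt", "client.key");
var intermediates = new X509Certificate2Collection();
intermediates.ImportFromPemFile("intermediates.pem");

var clientOptions = new SslClientAuthenticationOptions
{
    TargetHost = "backend.example.com",
    // .NET 8+: hand the leaf and its intermediates to the client side explicitly.
    ClientCertificateContext = SslStreamCertificateContext.Create(leaf, intermediates, offline: true)
};

using var tcp = new TcpClient("backend.example.com", 443);
using var ssl = new SslStream(tcp.GetStream());
await ssl.AuthenticateAsClientAsync(clientOptions, CancellationToken.None);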

@hbertsch

hbertsch commented Mar 28, 2023

@rzikm that is a pity. I guess, since 8.0 is currently in preview, it will take a while until it is supported by Azure Functions, so it's not usable (for us) as of now.

Could you give an example of how SslStreamCertificateContext.Create(...) would best be used in the above test (for dotnet 8.0)? That would be helpful.

Many thanks!

Edit

When adding both intermediates and the root certificate to the macOS system trust store (Keychain) and trusting the root, it works (dotnet 6.0):

(Screenshot of the macOS Keychain, 2023-03-28)

@wfurt
Member

wfurt commented Mar 28, 2023

Yes, it should, as long as X509Chain can find them. (Watch for #80490 on macOS.)
As for an example, you can perhaps look at our tests @hbertsch, like

public async Task SslStream_ClientCertificateContext_SendsChain()

@hbertsch

hbertsch commented May 4, 2023

Hello again @wfurt, I had a closer look into the referenced tests and understand the following:

// clientCertificate = the X509Certificate2 client certificate including the private key
// clientChain = an X509Certificate2Collection containing the root and intermediate certificates from which the clientCertificate is derived
// TargetHost = whatever hostname our server has

var clientOptions = new SslClientAuthenticationOptions()
{
    TargetHost = "localhost",
};
// ignores server certificate validation errors
clientOptions.RemoteCertificateValidationCallback = (sender, certificate, chain, sslPolicyErrors) => true;
// this is where the magic happens and the SSL stream gets the context to be used with the clientCertificate. After setting the certificate and the chain here, the chain should be sent alongside the client certificate in dotnet v8
clientOptions.ClientCertificateContext = SslStreamCertificateContext.Create(clientCertificate, clientChain);

But how can this be used with our existing implementations where we use HttpClient from System.Net.Http? Should HttpClient implicitly use the updated SslStream? Or is this currently only supported when working with raw SslStream objects?

I am on 8.0.100-preview.2.23157.25 of the SDK.

Update:

I guess I have answered my own question by playing around with the code. I don't know if there is a better way, but if I use this approach, all certificates are sent and we get a successful response:

...
// Set client certificate options
var clientOptions = new SslClientAuthenticationOptions()
{
    EnabledSslProtocols = System.Security.Authentication.SslProtocols.Tls12,
    AllowRenegotiation = true
};

clientOptions.RemoteCertificateValidationCallback = (sender, certificate, chain, sslPolicyErrors) => true;
clientOptions.ClientCertificateContext = SslStreamCertificateContext.Create(leafCertWithKey, chain);

// Create an HTTP client with the SSL options above as the transport configuration
var httpClient = new HttpClient(new SocketsHttpHandler
{
    SslOptions = clientOptions
});

// var httpClient = new HttpClient(handler);
var requestContent = new StringContent(xmlContent, Encoding.UTF8, "application/xml");
var response = await httpClient.PutAsync(uri, requestContent);

Assert.That(response.StatusCode, Is.EqualTo(HttpStatusCode.OK));
...

Thanks for your help here. Looking forward to dotnet v8 LTS :)

@tomrus88

tomrus88 commented Mar 16, 2024

Why is this still not fixed, and why are we forced to create different workarounds?

Yesterday I discovered that when you use the SslServerAuthenticationOptions/SslStreamCertificateContext classes, certificates are being secretly added into the Windows certificate store without my consent. This is unacceptable, and there should be a way for the developer to provide what they want to send - it should just work without polluting random system stores...

After 6 years Microsoft still hasn't fixed such a simple bug... Unbelievable...

@wfurt
Member

wfurt commented Mar 17, 2024

After 6 years Microsoft still hasn't fixed such a simple bug... Unbelievable...

Feel free to contribute, this is an open source project. This comes from a limitation of the underlying SChannel. If you know a way to do it, please share it @tomrus88.

@bartonjs
Member

Yesterday I discovered that when you use the SslServerAuthenticationOptions/SslStreamCertificateContext classes, certificates are being secretly added into the Windows certificate store without my consent.

Presumably the CA (Intermediates) store? Windows (the OS) does that when building a chain where a) it had to go find the intermediate and b) it ultimately trusted the chain. That saves it the work of repeating the "find it" step in the future. The CA/Intermediates store implies no trust; the OS just treats it as a cache.

@rzikm
Member

rzikm commented Mar 18, 2024

I believe he means this particular piece of code.

// OS failed to build the chain but we have at least some intermediates.
// We will try to add them to "Intermediate Certification Authorities" store.
if (!osCanBuildChain)
{
    X509Store? store = new X509Store(StoreName.CertificateAuthority, StoreLocation.LocalMachine);
    try
    {
        store.Open(OpenFlags.ReadWrite);
    }
    catch
    {
        // If using system store fails, try to fall-back to user store.
        store.Dispose();
        store = new X509Store(StoreName.CertificateAuthority, StoreLocation.CurrentUser);
        try
        {
            store.Open(OpenFlags.ReadWrite);
        }
        catch
        {
            store.Dispose();
            store = null;
            if (NetEventSource.Log.IsEnabled())
            {
                NetEventSource.Error(this, $"Failed to open certificate store for intermediates.");
            }
        }
    }

    if (store != null)
    {
        using (store)
        {
            // Add everything except the root
            for (int index = count; index < intermediates.Count - 1; index++)
            {
                store.Add(intermediates[index]);
            }

            osCanBuildChain = chain.Build(target);
            foreach (X509ChainStatus status in chain.ChainStatus)
            {
                if (status.Status.HasFlag(X509ChainStatusFlags.PartialChain) || status.Status.HasFlag(X509ChainStatusFlags.NotSignatureValid))
                {
                    osCanBuildChain = false;
                    break;
                }
            }

            if (!osCanBuildChain)
            {
                // Add also root to Intermediate CA store so OS can complete building chain.
                // (This does not make it trusted.)
                store.Add(intermediates[intermediates.Count - 1]);
            }
        }
    }
}

@ygoe

ygoe commented Aug 18, 2024

This is a very long discussion and I can't follow it entirely. I have an SslStream on the server side and need it to present a server certificate along with the CA certificate that signed it (both self-made, in a local network). Did I understand correctly that Windows 11 is incapable of providing that? And .NET 8 relies on that Windows component that's not suitable for the task? Can we please have OpenSSL in .NET on every platform if Windows doesn't support such basic scenarios? OpenVPN handles self-made CAs and other certificates effortlessly, so I assume the Windows kernel isn't needed for SSL. (I'd even consider it dangerous to let the privileged kernel handle such complex data from the network!)

@rzikm
Member

rzikm commented Aug 19, 2024

@ygoe

I have an SslStream on the server side and need it to present a server certificate along with the CA certificate that signed it (both self-made, in a local network). Did I understand correctly that Windows 11 is incapable of providing that?

This issue is quite old and there have been some changes made since it was filed. In currently supported versions of .NET, you can construct an SslStreamCertificateContext with the leaf + intermediate certificates you want your app to use and then supply it to the authentication options. That is fully supported on the server side; on the client side the property was added only in .NET 8, so on earlier versions there is no way to explicitly provide the cert chain for the client.

See specifically these two comments

Can we please have OpenSSL in .NET on every platform if Windows doesn't support such basic scenarios?

There are many reasons why we can't use OpenSSL on Windows. The short version is that .NET tries very hard not to ship any cryptographic code and relies on libraries ubiquitously present on the target platform. This is very much off-topic, but just to humor you I will list a few obstacles to relying on OpenSSL on Windows:

  • Shipping OpenSSL - OpenSSL is not present on all Windows machines by default, so .NET installations would have to redistribute the OpenSSL binaries, which has implications for the following points.
  • Security - imagine having a self-contained application running somewhere; self-contained means the OpenSSL binaries are packaged with the application. Now a critical vulnerability gets discovered: to protect your app, you have to wait until a new version of .NET with a fixed OpenSSL binary is available, then rebuild and redeploy your application (i.e. a significant delay).
  • Security (again) - the Windows certificate architecture works in such a way that private keys are never in the memory of the program itself (they are loaded by a privileged process, lsass, and all operations happen over IPC with the application), which means private keys cannot be dumped from your application's memory.
  • FIPS compliance - some customers require all crypto in an application to be FIPS compliant; since .NET would ship the OpenSSL binaries, it would be our burden to make sure the binaries we ship are FIPS compliant.
