I wanted to add a data point about how AI usage is changing the way we write software. This story is from last week.
We recently had a problem getting two computers to communicate with each other. RavenDB uses X.509 certificates for authentication, and the scenario in question required us to handle trusting an unknown certificate. The idea was to accomplish this using a trusted intermediate certificate. The problem was that we couldn’t get our code (using .NET) to send the intermediate certificate to the other side.
I tried two different models and posed the question in several different ways. Both kept circling back to the same proposed solution (adding both the client certificate and its signer to an X509CertificateCollection), but the other side would only ever see the leaf certificate, never the intermediate one.
I knew this was possible with TLS, because I have had to deal with such issues before. At that point, I gave up on the AI models and just turned to Google. I found some old GitHub issues discussing the problem (from 2018!) and from there was able to find the exact magic incantation needed to make it work.
For posterity’s sake, here is what you need to do:
var options = new SslClientAuthenticationOptions
{
    TargetHost = "localhost",
    ClientCertificates = collection,
    EnabledSslProtocols = SslProtocols.Tls13,
    // The crucial part: build a certificate context that includes the
    // intermediate, so the full chain (leaf + intermediate) is sent
    // during the TLS handshake instead of just the leaf.
    ClientCertificateContext = SslStreamCertificateContext.Create(
        clientCert,
        [intermediateCertificate],
        offline: true)
};
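For completeness, here is roughly how those options get consumed. This is a minimal sketch, not the code from our codebase: the host, port, and certificate file paths are placeholders, and validation is elided.

```csharp
using System.Net.Security;
using System.Net.Sockets;
using System.Security.Authentication;
using System.Security.Cryptography.X509Certificates;

// Placeholder certificate loading - substitute your own files/passwords.
var clientCert = new X509Certificate2("client.pfx", "password");
var intermediateCertificate = new X509Certificate2("intermediate.crt");
var collection = new X509CertificateCollection { clientCert };

var options = new SslClientAuthenticationOptions
{
    TargetHost = "localhost",
    ClientCertificates = collection,
    EnabledSslProtocols = SslProtocols.Tls13,
    // Supplying the intermediate here is what makes the full chain
    // show up on the server side.
    ClientCertificateContext = SslStreamCertificateContext.Create(
        clientCert,
        [intermediateCertificate],
        offline: true)
};

// Assumed connection target; the server now sees both certificates
// in the handshake and can validate against the intermediate.
using var tcp = new TcpClient();
await tcp.ConnectAsync("localhost", 443);
await using var ssl = new SslStream(tcp.GetStream());
await ssl.AuthenticateAsClientAsync(options);
```

Note that AuthenticateAsClientAsync has an overload accepting SslClientAuthenticationOptions directly, which is what lets the certificate context take effect.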
The key aspect from my perspective is that the models were not just useless here, but actively hostile to my attempts to solve the problem. AI is often helpful, but we need to know when to cut it off and just solve things ourselves.