HTTPS certificates in the age of quantum computing

By Daroc Alden
March 11, 2026

There has been ongoing discussion in the Internet Engineering Task Force (IETF) about how to protect internet traffic against future quantum computers. So far, that work has focused on key exchange as the most urgent problem; now, a new IETF working group is looking at adopting post-quantum cryptography for authentication and certificate transparency as well. The main challenge to doing so is the increased size of post-quantum certificates — around 40 times larger than their traditional counterparts. The techniques that the working group is investigating to reduce that overhead could have efficiency benefits for traditional certificates as well.

Authentication

When a browser connects to LWN.net, it first establishes an ephemeral encryption key to protect the session. This is key exchange, and some browsers and servers are already using the post-quantum cryptography standardized in 2024 to avoid "store now, decrypt later" attacks. Attacks of this kind store encrypted traffic for later, in the hope that future quantum computers will be able to break the key-exchange mechanisms used.
The possibility of these kinds of attacks makes it important to deploy quantum-resistant key-exchange mechanisms well in advance of quantum computers becoming practically usable.

Next, the server provides the browser with a certificate that proves it actually is LWN.net — authenticating the connection. That certificate is made up of a chain of signatures, where each signature comes from a "more trusted" organization and verifies that the next public key in the list is valid. In our case, this means that the server will send three signatures to the browser: one from LWN.net, one from Let's Encrypt, and one from the Internet Security Research Group's (ISRG) X1 Root certificate. With traditional cryptography, this certificate is approximately 3.5KB, which is roughly one third of the entire LWN front page's HTML content, after compression.

These signatures aren't subject to "store now, decrypt later" attacks in the same way encryption keys are, because compromising an authentication key later doesn't impact the correctness of the connection now. Therefore, while key-exchange mechanisms need to defend against future quantum computers to keep communications private, authentication mechanisms only need to defend against current computers.

Depending on the algorithm in question, post-quantum cryptography can produce signatures much larger than comparable traditional algorithms. ML-DSA-44, which is a standardized post-quantum signature scheme thought to have security similar to Ed25519 signatures, produces signatures 37 times larger. Naively adopting post-quantum signatures for authentication could cause certificate chains to take up more data than the actual content of the web site in question, at least for small, text-heavy web sites like LWN.
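To put that multiplier in perspective, the published signature sizes of the two schemes (64 bytes for Ed25519 per RFC 8032, 2420 bytes for ML-DSA-44 per FIPS 204) can be compared directly. A small sketch, using the three-signature chain from the LWN example above and counting signatures only (public keys, which also grow, are left out):

```python
# Published signature sizes, in bytes.
ED25519_SIG = 64       # RFC 8032
ML_DSA_44_SIG = 2420   # FIPS 204

ratio = ML_DSA_44_SIG / ED25519_SIG
print(f"ML-DSA-44 signature is {ratio:.1f}x the size of Ed25519")  # → 37.8x

# A chain like LWN's carries three signatures; counting signatures alone:
chain_classical = 3 * ED25519_SIG    # 192 bytes
chain_pq = 3 * ML_DSA_44_SIG         # 7260 bytes
print(f"3-signature chain: {chain_classical} B vs {chain_pq} B")
```

The 37.8x ratio matches the article's "37 times larger"; the real gap in transmitted certificates is larger still once the bigger post-quantum public keys are included.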
To ensure that certificate authorities are issuing certificates according to their stated policies, many certificates also include signatures from certificate-transparency logs, which publish a list of all certificates being issued. Browsers will typically refuse to trust a certificate authority that does not participate in certificate transparency, since otherwise it is much harder to tell whether the certificate authority is issuing certificates that it shouldn't. The extra bandwidth overhead that would be incurred by a direct switch to post-quantum signatures for authentication would have a measurable impact on the overall latency of connections.

Logging

The solution that the new working group (called "PKI, Logs, and Tree Signatures" or PLANTS) has been discussing inverts the relationship between signatures from certificate authorities and the transparency logs. Currently, a certificate authority first creates a certificate, then logs it in a certificate-transparency log, and then optionally includes the signature from the log in the certificate as a piece of additional information. This is, in some sense, redundant: the information that the certificate is valid is already present in the certificate-transparency log, so why send the client any information other than proof that it appears in the log?

The mechanism PLANTS proposes would have each certificate authority maintain its own append-only issuance log, containing a list of every certificate it has issued. The same organizations that run certificate-transparency logs today would monitor and mirror each certificate authority's log to ensure compliance. They primarily check that the log is actually append-only, so that a certificate authority can't backdate changes to issued certificates. That way, if there is a security problem caused by a misbehaving certificate authority, it will be easy to prove that and revoke trust in the authority.
Instead of having a chain of signatures in a certificate to represent some transitive relationship between a certificate authority and a root of trust, the third-party observers would add their signatures to a certificate authority's log as they validate it. A browser can choose its own criteria for which third-party observers it trusts, and whether it requires a quorum of them before accepting the state of an issuance log. The certificate seen by the client would therefore no longer be a chain of signatures leading back to a root of trust: it would be a set of signatures from the certificate authority and any relevant observers attesting to the state of the issuance log, plus a proof that the web server's public key was included in the issuance log. This constitutes what PLANTS calls a "full" certificate.

For an individual web site, a full certificate doesn't decrease the number of needed signatures; but since the issuance logs are append-only, if a browser has already verified the issuance log for a certificate authority up to some checkpoint, it doesn't need to see the signatures for that checkpoint again. Instead, it can ask the server to just send the proof that the server's public key appeared in the log prior to that point — a "signatureless" certificate that should be substantially smaller.

Merkle trees

Those proofs use Merkle trees, a cryptographic commitment scheme which uses a small number of hashes to show that a leaf node belongs to a binary tree. That way, a certificate authority can batch up a large number of certificates into a single tree that only needs to be signed once. The overall number of signatures to be made and verified becomes independent of the number of issued certificates. The idea is that each internal node of the tree stores the hash of its children, all of the way up to the root.
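As a concrete sketch, the following toy Python builds a Merkle tree over a batch of hypothetical certificates with SHA-256, then produces and checks an inclusion proof. It is a simplified model: real certificate-transparency trees (RFC 6962) additionally domain-separate leaf and interior hashes to prevent proof forgery, which this sketch omits. The batch size of 4,096 is illustrative, roughly one minute of issuance at Let's Encrypt's scale (six million certificates per day is about 4,200 per minute):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(leaves):
    """Return a list of levels, leaf hashes first, root last.
    An unpaired node at the end of a level is promoted unchanged."""
    levels = [[h(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        cur, nxt = levels[-1], []
        for i in range(0, len(cur) - 1, 2):
            nxt.append(h(cur[i] + cur[i + 1]))
        if len(cur) % 2:
            nxt.append(cur[-1])
        levels.append(nxt)
    return levels

def inclusion_proof(levels, index):
    """Sibling hashes adjacent to the path from leaf `index` to the root."""
    proof = []
    for level in levels[:-1]:
        sib = index ^ 1  # sibling differs only in the lowest bit
        if sib < len(level):
            proof.append((level[sib], sib < index))  # (hash, is-left-sibling)
        index //= 2
    return proof

def verify(leaf, proof, root):
    """Recompute the root from a leaf and its proof, compare to `root`."""
    acc = h(leaf)
    for sibling, sibling_is_left in proof:
        acc = h(sibling + acc) if sibling_is_left else h(acc + sibling)
    return acc == root

leaves = [f"cert-{i}".encode() for i in range(4096)]  # one minute's batch
levels = build_tree(leaves)
root = levels[-1][0]
proof = inclusion_proof(levels, 10)
assert verify(leaves[10], proof, root)
print(len(proof), "hashes,", len(proof) * 32, "bytes")  # 12 hashes, 384 bytes
```

With 4,096 leaves, the proof contains twelve 32-byte hashes (384 bytes), matching the back-of-the-envelope numbers the article gives for a per-minute checkpoint; only the root needs a signature, regardless of how many certificates are batched.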
To prove that a leaf node (such as node 10 in the article's example tree) belongs to a tree with a given root node, it suffices to provide the hashes of the other nodes that are "adjacent" to the path from the leaf to the root (nodes [8,10), 11, and 12). This lets a verifier reconstruct what the root hash would be if 10 were included, and then check that this matches what the root hash actually is. In this example, the browser would calculate the hash of 10, then the hash of [10,12), then the hash of [8,12), before finally calculating the root hash and ensuring that it matches. The number of additional hashes to provide grows logarithmically with the size of the tree. And, since cryptographic hashes aren't vulnerable to the same kinds of quantum attacks as public-key cryptography, the size of a Merkle inclusion proof like this doesn't change when switching to post-quantum cryptography.

Let's Encrypt issues around six million certificates per day (although that number is expected to go up as the standardized certificate lifetime goes down over the next several years). If it adopted the new system and created a new checkpoint every minute (meaning that a server could obtain a full certificate right away, but would need to wait up to one minute to obtain a signatureless certificate), each certificate would need to include twelve hashes (totaling 384 bytes for SHA-256 hashes). That is only 16% of the size of a single ML-DSA-44 signature.

Of course, servers will ideally have both: a full certificate for clients that have not seen a recent checkpoint, and a signatureless certificate for clients that have. The full certificate will be significantly larger than current certificates (around 133KB using ML-DSA-44), but it hopefully only needs to be used by a small fraction of connections.

To prevent an issuance log from growing without bound, older entries are periodically pruned as they expire.
This might seem to be at odds with the append-only nature of an issuance log, but since expired certificates shouldn't validate correctly anyway, the certificate authority can delete the corresponding leaf nodes and any internal nodes that are therefore no longer usable in actual proofs. The tree maintains the same conceptual size, but the on-disk storage requirements remain proportional to the number of active certificates.

Revoking mistakenly issued certificates, which don't have the decency to expire at known times, is a little more complicated: along with the issuance log, a certificate authority also maintains a set of revoked certificates. This set of revocations is also covered by the signatures of each checkpoint, so a browser will obtain an updated set of certificate revocations every time it validates a full certificate from a given certificate authority.

Adoption

The PLANTS working group is still in the early stages of the standardization process — a draft standard exists, but it has not yet been proposed for standardization, and probably won't be within the next year, since several details remain to be worked out. Despite that, Google has announced a plan to evaluate the performance impacts of Merkle-tree-based certificates in Chrome, and to deploy an experimental post-quantum certificate-authority system based on the PLANTS draft by the end of 2027. Most likely, certificate authorities, server operators, and users won't need to update any of their configurations until 2029 or 2030.

The key question Google hopes to answer, which will impact the usability of the protocol, is whether clients will actually stay up-to-date enough (by occasionally verifying a full certificate) to benefit from signatureless certificates on average. The whole protocol only provides a bandwidth and latency advantage if actual browsers visit enough distinct web sites with certificates issued by the same certificate authority.
Over the next several months, the Google Chrome team will hopefully provide empirical data on that question. Users of non-Chrome browsers will go unmeasured, at least for now. Hopefully, other browser projects will either join the experiment or have browsing patterns that are statistically similar enough to Chrome's userbase to draw reasonable conclusions.

Changes in web infrastructure often take a significant amount of time. Between running experiments, standardizing the protocol, and rolling out the changes to certificate authorities and browsers, it may be a long time before we see real connections authenticated with post-quantum cryptography. Still, even the most pro-quantum-computing estimates suggest that the system will be in place before quantum computers can pose a real threat to the security of authentication. In a world that is increasingly hectic, it's nice to occasionally have a security concern that is handled well before it becomes an actual problem.

Comments

Premature
Posted Mar 11, 2026 15:37 UTC (Wed) by ikm (subscriber, #493)

> it's nice to occasionally have a security concern that is handled well before it becomes an actual problem

I have to disagree here. In most cases I've seen, solving non-existent problems preemptively is rarely beneficial. Given that this one isn't even a store-now-decrypt-later, that no real capable-enough quantum computers are on the horizon, and the prior history of finding problems with post-quantum algorithms, I'd rather not see this deployed at this point. It just creates headaches and overhead for no reason that is good enough right now.

Premature
Posted Mar 11, 2026 15:51 UTC (Wed) by daroc (editor, #160859)

Well, the Merkle-tree-based certificates do work with traditional, non-quantum signatures as well.
The bandwidth benefits there are potentially less extreme, but that's a matter for measuring people's actual browsing behavior. I do see your point, though. Certainly nobody should be adopting a new, purely post-quantum signature scheme without accompanying it with a traditional signature scheme. Luckily, NIST has standardized a set of secure ways to combine multiple signature schemes, so using hybrid authentication and encryption gives the best of both worlds.

Premature
Posted Mar 11, 2026 20:54 UTC (Wed) by stevie-oh (subscriber, #130795)

Peter Gutmann's short story "On the Heffalump Threat" is a rather poignant allegory for this whole thing.

Premature
Posted Mar 11, 2026 23:30 UTC (Wed) by PeeWee (subscriber, #175777)

Funny you should mention Gutmann, as I keep being reminded of his "Bollocks" talk at any mention of "post-quantum". I've written a comment with my take on the whole quantum "computing" bubble on the Phoronix forum. At this point I am almost certain the stakeholders involved all but know that it cannot be done; Schrödinger's cat cannot be alive and dead at the same time. As soon as that box is opened, it is either dead or alive, not both, but "quantum computing" scientists are trying to make us believe that it can be made so. I believe that in some future physics textbooks "quantum computers" will have their place right next to Perpetuum Mobiles, or perpetual motion machines.

Premature
Posted Mar 12, 2026 9:49 UTC (Thu) by ballombe (subscriber, #9523)

The real problem with QC is economic: so far there are very few quantum algorithms that are faster than classical ones, and so very few applications outside breaking crypto, so this does not get you money, especially once post-quantum crypto is deployed. So there is little economic incentive to continue to pursue QC when LLM is getting all the money.
Premature
Posted Mar 12, 2026 11:47 UTC (Thu) by anselm (subscriber, #2796)

> So there is little economic incentive to continue to pursue QC when LLM is getting all the money.

The QC guys are trying to hang around in the hope that the VCs will eventually figure out that LLMs are so much hot air, so QC can be the focus of the next big hype cycle. Much like a few years ago, when the VCs eventually figured out that blockchains were so much hot air, so LLMs became the focus of the next big hype cycle. (The only problem is that once the LLM bubble has popped, the VCs may not have any money left for QC.)

QC, like generative AI, certainly sits in an enticing hotspot that combines "looks like innovative hot-shit guru-level stuff", "seems maybe a bit lame today but will be totally, unbelievably, mind-blowingly great and absolutely indispensable in 5 years' time according to the people hyping it now", and "fiendishly expensive" in a way that is hard to pass over if you have money to burn.

QC's hype cycle
Posted Mar 12, 2026 13:48 UTC (Thu) by farnz (subscriber, #17727)

The thing that could kill the QC hype cycle early is if quantum complexity theory can determine the relationship between BQP and NP. The reason there's room for the sleight-of-hand merchants to hype QC is that we don't know how NP and BQP relate; the best we can do is to say that there are some problems (such as integer factorization) in both NP and BQP, that all problems in P are in both NP and BQP, and that we believe that some problems (like travelling salesman) are in NP but not BQP. But the hype merchants can argue that until Shor's algorithm was discovered, integer factorization was only known to be in NP, and that there might be similar algorithms for an NP-complete problem.
Quantum Computers do work in principle
Posted Mar 12, 2026 12:12 UTC (Thu) by chris_se (subscriber, #99706)

> At this point I am almost certain the stakeholders involved all but know that it cannot be done

As someone with a background in theoretical condensed-matter physics, I disagree here. If your claim is just "the commercial quantum computing companies are vastly overselling what they can achieve" - sure, I think that's trivially true. If you say "we're at least 10, more likely at least 20, years away from a quantum computer that'll be even remotely useful" - I'd tend to agree. But I do think that people working on this genuinely believe that it will be possible eventually.

There _are_ macroscopic quantum states (e.g. Bose-Einstein condensates). There _are_ experiments that show that quantum coherence can survive in really weird conditions (back in 1999, people demonstrated double-slit interference with C60 molecules; that was quite a famous paper in Nature back then). These things, while not intrinsically useful for quantum computers, indicate that if you can manage the corresponding interactions well enough, you can keep a quite large quantum state around.

The main issues with current quantum computer designs are 1) scaling, 2) coherence times, and 3) control. While it is easily possible to optimize for one of these quantities, nobody has managed to optimize for all three at the same time as of yet. But there's no intrinsic limit given by the known laws of physics why that should be impossible - it's just a _really_, _really_ hard problem to solve. There have been improvements over the last years, though. And in contrast to Schrödinger's cat, we're talking about the coherence of the couple of million qubits needed for useful computations, i.e. 10^6 - 10^7, at very low temperatures near absolute zero, not the coherence of all ~10^26 atoms in a cat at room temperature.
We're still a ways off from that, but the comparison to Schrödinger's cat is really misleading in my eyes.

As for post-quantum cryptography: I'm very sympathetic to the argument that we don't need to deploy signatures as of right now, because key exchange is the only thing we need to worry about _at this moment_, since signatures will only be needed once a quantum computer exists. But I do think that implementing a hybrid EC + PQC scheme for key exchange is a sensible precaution to take right now. Sure, most of my traffic is probably completely irrelevant when we look at it 20 years from now. But I can easily imagine a changed political landscape by then where something I do right now might get me in trouble in 20 years.

Quantum Computers do work in principle
Posted Mar 12, 2026 18:01 UTC (Thu) by nbecker (subscriber, #35200)

But this still leaves the fact that there have so far been very few algorithms that show quantum advantage, and even fewer that have practically important applications. So while you argue that someday QC should be feasible, and I don't disagree, we have still made limited progress on finding actual applications that show advantage.

Quantum Computers do work in principle
Posted Mar 12, 2026 19:40 UTC (Thu) by chris_se (subscriber, #99706)

I don't disagree with you there, but the prospect of breaking RSA and similar things will cause at least certain people to continue at least some investment into this technology.

Premature
Posted Mar 12, 2026 17:41 UTC (Thu) by smitty_one_each (subscriber, #28989)

You really need to get onboard with Quantum AI Room-Temperature Super-Conductive (QAIRTSC) Computing.

Copyright © 2026, Eklektix, Inc. Comments and public postings are copyrighted by their creators. Linux is a registered trademark of Linus Torvalds.