Quantum computers, capable of breaking the public-key cryptography that underpins the modern internet, do not exist.
Yet.
But that is all about to change.
So “not yet” is not the right frame. The relevant adversary does not need a quantum computer today; it only needs to be patient: to collect encrypted traffic now, store it, and decrypt it later, once the technology cooperates. “Harvest now, decrypt later” is not a speculative threat model but a rational strategy, and it is almost certainly already underway. Any communication that must remain confidential for a decade or more is therefore a present-tense vulnerability. The threat lies in the future; the organizational work is now, and most organizations are moving slowly.
The migration to post-quantum cryptography is the kind of story that resists being told. Its central drama is administrative. Its heroes work in standards bodies. Its most celebrated outcome will be the absence of any dramatic outcome: browsers connecting, messages delivered, apps syncing, signatures verified, billions of small transactions occurring without incident in a world where the mathematics underneath them has been quietly replaced, the way a crew replaces the cables of a bridge while traffic flows underneath, while commuters listen to the radio and think about dinner.
This is what infrastructure looks like from the inside. You do not see it unless it fails.
In August 2024, the National Institute of Standards and Technology released the first finalized post-quantum cryptography standards: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA), and FIPS 205 (SLH-DSA). NIST points toward a transition timeline that deprecates and ultimately removes quantum-vulnerable algorithms by 2035, with high-priority systems moving earlier. The U.K.’s National Cyber Security Centre gives milestone dates of 2028 for discovery and planning, 2031 for early migration of critical systems, and 2035 for completion. The NSA pushes national security systems harder, with 2027 expectations for new deployments and 2030 phase-out milestones. The bureaucratic infrastructure of transition is in place. The transition itself remains.
A comparison keeps surfacing in technical circles: Y2K. The comparison is usually made to suggest scale, urgency, or the difficulty of explaining a threat that has not yet materialized. The deeper resonance is structural. Y2K mobilized enormous institutional effort whose success could only be measured by the absence of disaster. If it worked, nothing happened. The public event was a non-event.
Post-quantum migration has the same shape. It is infrastructural prevention, not technological theater. The engineers who get it right will not be celebrated. They will simply not be blamed.
Devil’s in the details
The new algorithms use more space than the old ones, and in cryptography, size is no minor concern. ML-KEM-768, used for key exchange, has a public key of 1,184 bytes and a ciphertext of 1,088 bytes. SLH-DSA, used for digital signatures, produces signatures of 7,856 bytes even in its smallest parameter set. For comparison, a classical elliptic-curve signature is typically under 100 bytes. These size increases change handshake packetization, certificate-chain behavior, hardware security module design, logging, and storage assumptions. Post-quantum cryptography is unlike earlier cryptographic updates because these sizes force protocol redesign rather than drop-in substitution.
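The size arithmetic can be made concrete. The sketch below tabulates the published parameter sizes and checks the quantities engineers actually care about: whether a hybrid key share still fits in a single packet, and how much larger the new signatures are. The byte counts come from the public parameter sets, and the 1,400-byte single-packet budget is a rough assumption for illustration, not a protocol constant.

```python
# Illustrative size arithmetic for post-quantum handshakes.
# Byte counts are from published parameter sets (verify against FIPS 203/205
# and the relevant curve specs for your exact configuration).

SIZES = {
    "x25519_public_key": 32,        # classical ECDH key share
    "mlkem768_public_key": 1184,    # ML-KEM-768 encapsulation key
    "mlkem768_ciphertext": 1088,    # ML-KEM-768 ciphertext
    "ecdsa_p256_signature": 71,     # typical DER-encoded size, under 100 bytes
    "slhdsa_128s_signature": 7856,  # smallest SLH-DSA signature
}

# Rough per-packet payload budget after IP/TCP/TLS headers (an assumption).
TYPICAL_MTU_PAYLOAD = 1400

def hybrid_client_share() -> int:
    """Bytes a hybrid X25519 + ML-KEM-768 key share adds to a ClientHello."""
    return SIZES["x25519_public_key"] + SIZES["mlkem768_public_key"]

share = hybrid_client_share()
print(f"hybrid key share: {share} bytes")                      # 1216 bytes
print(f"fits in one packet: {share < TYPICAL_MTU_PAYLOAD}")    # True, barely
growth = SIZES["slhdsa_128s_signature"] // SIZES["ecdsa_p256_signature"]
print(f"signature growth vs. ECDSA: roughly {growth}x")
```

The margin is the point: a 1,216-byte key share leaves little room before a handshake message spills into a second packet, which is exactly the budget pressure described below.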
Meta’s internal TLS rollout provides what amounts to a field report from the transition. The company chose a hybrid design combining classical elliptic-curve exchange with post-quantum ML-KEM, preferred the higher-security 768-parameter version, but dropped to the smaller 512-parameter version in some internal cases because packet-size constraints and handshake latency were otherwise too costly. The company reported roughly a 40% increase in CPU cycles during early hybrid rollout and discovered a multi-threading bug in the underlying cryptographic library during deployment at scale. The client’s ML-KEM-768 key share was large enough to threaten the TLS packet budget, sometimes adding an extra network round trip. This is what migration looks like in practice: engineering trade-offs among latency, compatibility, and fault discovery, made under production conditions, with real consequences. A major security transition can hinge on whether a cryptographic object still fits in one packet.
This migration rewards organizations that already know where their cryptography lives. Meta’s migration framework describes a maturity ladder moving from “PQ-Unaware” through “PQ-Aware,” “PQ-Ready,” and “PQ-Hardened” to “PQ-Enabled,” and the prerequisite for any rung above the first is a working cryptographic inventory. An organization that does not know which of its systems use RSA, where its certificates are stored, or what its hardware security modules support cannot migrate. The migration rewards institutions that already behave like maintainers of infrastructure rather than its consumers. NIST calls this broader capacity “crypto agility”: institutional self-knowledge under conditions of future threat.
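What a cryptographic inventory does with its data is simple to illustrate. The sketch below classifies endpoints by the algorithms they use; the endpoint names and data are hypothetical, a real inventory is fed by scanners, certificate stores, and configuration databases, and the rung labels borrow Meta’s ladder loosely rather than reproducing its definitions.

```python
from dataclasses import dataclass

# Algorithms a large-scale quantum computer could break.
QUANTUM_VULNERABLE = {"RSA-2048", "RSA-4096", "ECDSA-P256", "X25519"}

@dataclass
class Endpoint:
    name: str
    key_exchange: str
    signature: str

# Hypothetical inventory rows; real ones come from scanners and cert stores.
inventory = [
    Endpoint("api.internal", "X25519", "ECDSA-P256"),
    Endpoint("payments.internal", "X25519+ML-KEM-768", "ECDSA-P256"),
    Endpoint("legacy.internal", "RSA-2048", "RSA-2048"),
]

def classify(ep: Endpoint) -> str:
    """Rough rung on a PQ maturity ladder, keyed on what the endpoint uses."""
    if "ML-KEM" in ep.key_exchange and ep.signature not in QUANTUM_VULNERABLE:
        return "PQ-Enabled"
    if "ML-KEM" in ep.key_exchange:
        return "PQ-Ready"  # hybrid key exchange, classical signatures remain
    return "PQ-Unaware"

for ep in inventory:
    print(ep.name, classify(ep))
```

Trivial as the classification is, the hard part is the list itself: the migration stalls not on the `classify` function but on assembling an `inventory` that is actually complete.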
No comprehension, no consent
Cloudflare reports that well over 60% of human-generated TLS traffic to its network is already protected with hybrid ML-KEM. Apple has deployed iMessage PQ3 since iOS 17.4 and now enables quantum-secure TLS by default. Signal introduced the Sparse Post-Quantum Ratchet so that ongoing conversations, not just initial handshakes, gain post-quantum forward secrecy. Signal’s public explanation emphasizes that the user experience does not change.
This is the governing aesthetic of the entire transition. The strongest form of cryptography is the one whose complexity has been absorbed into protocol design so completely that ordinary users require neither comprehension nor consent. Security succeeds when it disappears into the ordinary path. The future of trust arrives as extra bytes, silently negotiated in a handshake no one watches.
The European Union Agency for Cybersecurity, ENISA, found in its 2025 survey that post-quantum adoption in the space sector sits at roughly 2%. The technical standards have outrun much of the institutional world that must absorb them. The destination is clear, but parts of the road are still being paved. Somewhere, in data centers that do not advertise their purposes, traffic is being stored against a future that the collectors cannot yet quite see.