The timeline of quantum-computer threats is often exaggerated, and in the near term the risk from software vulnerabilities remains far greater than the risk of quantum attacks. Blockchains do not need to rush to deploy post-quantum signatures, but planning should start now. This article is derived from a piece by Justin Thaler, compiled and translated by Vernacular Blockchain.

The timeline for cryptographically relevant quantum computers is often exaggerated, leading to calls for an urgent, wholesale transition to post-quantum cryptography. But these calls often ignore the costs and risks of premature migration, and they ignore the vastly different risk profiles of different cryptographic primitives:

Post-quantum encryption requires deployment now despite its costs. "Harvest now, decrypt later" (HNDL) attacks are already underway: sensitive data encrypted today will still be valuable when quantum computers arrive, even if that is decades away. The performance overhead and implementation risks of post-quantum encryption are real, but HNDL attacks leave no choice for data that requires long-term confidentiality.

Post-quantum signatures face different considerations. They are less vulnerable to HNDL attacks, and their costs and risks (larger sizes, performance overhead, immature implementations, and bugs) call for careful deliberation rather than immediate migration.

These distinctions matter. Misconceptions skew cost-benefit analysis and cause teams to overlook more pressing security risks, such as programming errors (bugs).
The real challenge in transitioning to post-quantum cryptography is matching urgency to actual threats. Below, I address common misconceptions about quantum threats to cryptography, covering encryption, signatures, and zero-knowledge proofs, with a particular focus on their impact on blockchains.

What does the timeline look like? Despite high-profile claims, the likelihood of a cryptographically relevant quantum computer (CRQC) arriving in the 2020s is extremely low. By CRQC I mean a fault-tolerant, error-corrected quantum computer capable of running Shor's algorithm at a scale sufficient to attack elliptic-curve cryptography or RSA within a reasonable time frame (for example, breaking secp256k1 or RSA-2048 within at most a month of continuous computation). On any reasonable reading of public milestones and resource estimates, we are still far from that point. Companies sometimes claim that a CRQC could appear before 2030, or well before 2035, but publicly known progress does not support these claims.

For context: across all current architectures (trapped ions, superconducting qubits, and neutral-atom systems), today's quantum computing platforms are nowhere near the hundreds of thousands to millions of physical qubits required to run Shor's algorithm against RSA-2048 or secp256k1 (the exact figure depends on error rates and the error-correction scheme). The limiting factors are not just qubit counts but also gate fidelity, qubit connectivity, and the sustained error-corrected circuit depth needed to run deep quantum algorithms. While some systems now exceed 1,000 physical qubits, the raw qubit count is itself misleading: these systems lack the connectivity and gate fidelity required for cryptographically relevant computation.
Recent systems are approaching the physical error rates at which quantum error correction becomes useful, but no one has demonstrated more than a handful of logical qubits sustaining deep error-corrected circuits, let alone the thousands of high-fidelity, deep-circuit, fault-tolerant logical qubits needed to actually run Shor's algorithm. The gap between showing that quantum error correction works in principle and reaching the scale required for cryptanalysis remains enormous. In short: unless both qubit counts and fidelity improve by several orders of magnitude, cryptographically relevant quantum computers remain out of reach.

Corporate press releases and media coverage, however, can be confusing. Common sources of misconception include:

Demos claiming "quantum advantage" on contrived tasks. These tasks were chosen not for their usefulness but because they can run on existing hardware while appearing to show large quantum speedups, a fact often obscured in the announcements.

Claims of thousands of physical qubits. These typically refer to quantum annealers, not the gate-model machines needed to run Shor's algorithm against public-key cryptography.

Loose use of the term "logical qubits." Physical qubits are noisy; as noted above, quantum algorithms require logical qubits, and Shor's algorithm requires thousands of them. With quantum error correction, one logical qubit is implemented with many physical qubits, usually hundreds to thousands depending on error rates. But some companies have stretched the term beyond recognition. One recent announcement, for example, claimed to implement a logical qubit with only two physical qubits using a distance-2 code. This is absurd: a distance-2 code can only detect errors, not correct them.
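To see why a distance-2 code falls short, recall the standard coding-theory bound: a code of distance d can detect up to d − 1 errors but correct only ⌊(d − 1)/2⌋ of them. A minimal sketch in Python (the sample distances are illustrative, not tied to any particular hardware roadmap):

```python
# A code of distance d can detect up to d - 1 errors,
# but can only *correct* floor((d - 1) / 2) of them.
def detectable_errors(d: int) -> int:
    return d - 1

def correctable_errors(d: int) -> int:
    return (d - 1) // 2

for d in [2, 3, 5, 25]:
    print(f"distance {d}: detects {detectable_errors(d)}, "
          f"corrects {correctable_errors(d)}")
# distance 2 detects 1 error but corrects 0 -- detection without correction
```

At d = 2 the correctable count is zero, which is exactly the objection above: such a device can flag that something went wrong but cannot recover, so calling it a fault-tolerant logical qubit is a stretch.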
Truly fault-tolerant logical qubits for cryptanalysis require hundreds to thousands of physical qubits each, not two. More broadly, many quantum computing roadmaps use "logical qubits" to mean qubits that support only Clifford operations. Clifford circuits can be simulated efficiently on classical computers, so they are not sufficient to run Shor's algorithm, which requires thousands of error-corrected T-gates (or, more generally, non-Clifford gates). Even when a roadmap targets "thousands of logical qubits by year X," that does not mean the company expects to run Shor's algorithm against classical cryptography in year X. These practices have badly distorted public perception of how close we are to a cryptographically relevant quantum computer, even among seasoned observers.

That said, some experts are genuinely excited about the progress. Scott Aaronson, for example, recently wrote that given the "current staggering speed of hardware development," he now sees it as a realistic possibility that we will have a fault-tolerant quantum computer running Shor's algorithm before the next US presidential election. But Aaronson later clarified that he did not mean a cryptographically relevant quantum computer: by his standard, even a fully fault-tolerant run of Shor's algorithm factoring 15 = 3 × 5 would count, and that computation can be done far faster with pencil and paper. The bar is executing Shor's algorithm at small scale, not at cryptographically relevant scale; earlier experiments that factored 15 on quantum computers used simplified circuits rather than the full, fault-tolerant algorithm. And there is a reason those experiments always factored the number 15: arithmetic modulo 15 is easy, while factoring even slightly larger numbers such as 21 is much harder.
Accordingly, quantum experiments claiming to factor 21 often rely on extra hints or shortcuts. In short, the prospect of a cryptographically relevant quantum computer capable of breaking RSA-2048 or secp256k1 within the next five years (the scale that matters for practical cryptography)…
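The "arithmetic modulo 15 is easy" point can be made concrete. Shor's algorithm reduces factoring N to finding the period r of f(x) = aˣ mod N; the factors then come from a gcd computation. The sketch below finds the period classically (that is the one step a quantum computer would accelerate) purely to illustrate how little work N = 15 requires; the function names are my own, not from any quantum SDK:

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Classically find the multiplicative order r of a modulo n,
    i.e. the smallest r with a^r = 1 (mod n). This is the step
    Shor's algorithm replaces with quantum period finding."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_postprocess(a: int, n: int):
    """Derive factors of n from the period of a^x mod n (a coprime to n)."""
    r = find_period(a, n)
    if r % 2 == 1:
        return None            # odd period: retry with another base a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None            # trivial square root: retry with another base
    return gcd(y - 1, n), gcd(y + 1, n)

# For a = 7, N = 15: the period is r = 4, 7^2 mod 15 = 4,
# and gcd(3, 15) = 3, gcd(5, 15) = 5 recover the factors.
print(shor_postprocess(7, 15))  # -> (3, 5)
```

Everything here runs in microseconds on a laptop, which is why a quantum demo that "factors 15" (or 21, with hints) says nothing about breaking RSA-2048: the hard part is running the period-finding circuit fault-tolerantly on numbers thousands of bits long.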
In-depth analysis: Are we too afraid of the cryptographic security threats posed by quantum computers?