This white paper examines the critical shift toward post-quantum cryptography (PQC) in embedded systems, focusing on Key Encapsulation Mechanisms (KEMs) as the core replacement for traditional Diffie–Hellman-based key exchanges. With Cryptographically Relevant Quantum Computers (CRQCs) potentially emerging within 16 years, classical algorithms like RSA and ECC face obsolescence under Shor’s algorithm. While PQC migration already poses challenges in general-purpose computing, embedded contexts introduce unique constraints related to performance, memory usage, and resilience to physical attacks—making effective KEM deployment both urgent and complex.
The paper begins by outlining the motivation behind transitioning to KEM-based protocols. Standard RSA- or ECC-driven public key exchanges rely on mathematical structures vulnerable to quantum threats, prompting bodies like NIST to reconsider cryptographic primitives. However, practical migration is nontrivial. Embedded systems must contend with larger key sizes, increased latency, and stringent performance requirements, all of which demand tailored solutions not typically encountered in software-dominated environments.
Central to this discussion is the distinction between classical key exchange and KEMs. Whereas Diffie–Hellman has both parties generate key pairs and derive the shared secret symmetrically, a KEM splits key generation and key usage into separate encapsulation and decapsulation functions—crucially altering protocol design. The paper’s spotlight on ML-KEM (standardized by NIST) illustrates how this lattice-based KEM departs from conventional approaches. Derived from CRYSTALS-Kyber, ML-KEM applies the Fujisaki–Okamoto transform to an underlying public-key encryption scheme. It thus achieves high-speed performance but, like most post-quantum schemes, comes with significantly larger key and ciphertext sizes than classical protocols—creating potential bottlenecks for embedded devices.
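The asymmetry between the two roles can be made concrete with a small sketch. The toy below expresses classical Diffie–Hellman *through* the KEM interface (keygen / encapsulate / decapsulate), which is a standard pedagogical device; it is not ML-KEM, and the 61-bit group parameters and function names are illustrative assumptions only, far too small for real security.

```python
import hashlib
import secrets

# Toy group parameters (assumption for illustration only — insecure in practice).
P = 2**61 - 1  # a Mersenne prime
G = 2

def keygen():
    """Recipient's key pair. Only the recipient ever runs this."""
    sk = secrets.randbelow(P - 2) + 2
    pk = pow(G, sk, P)
    return pk, sk

def encapsulate(pk):
    """Sender derives a fresh shared secret from pk alone, plus a
    ciphertext that transports it. No sender key pair is exchanged."""
    eph = secrets.randbelow(P - 2) + 2
    ct = pow(G, eph, P)  # the "ciphertext" is the ephemeral public value
    ss = hashlib.sha256(pow(pk, eph, P).to_bytes(8, "big")).digest()
    return ct, ss

def decapsulate(sk, ct):
    """Recipient recovers the same shared secret from sk and ct."""
    return hashlib.sha256(pow(ct, sk, P).to_bytes(8, "big")).digest()

pk, sk = keygen()
ct, ss_sender = encapsulate(pk)
assert ss_sender == decapsulate(sk, ct)
```

The protocol-design consequence is visible in the shape of the API: only one party holds a long-lived key pair, and the sender's contribution is a one-shot ciphertext rather than a symmetric public-key exchange.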
The path toward standardization remains dynamic. NIST selected CRYSTALS-Kyber and certain digital signature algorithms (e.g., CRYSTALS-Dilithium) in 2022, finalizing them as ML-KEM (FIPS 203) and ML-DSA (FIPS 204) in 2024. Other alternatives—HQC, FrodoKEM, and Classic McEliece—remain under scrutiny at both national and international levels. Each brings distinct performance and security trade-offs, underscoring that cryptographic agility is imperative. Regulatory divergence across regions further complicates adoption, urging organizations to maintain multiple post-quantum options to satisfy heterogeneous compliance requirements.
In parallel, the paper underscores the role of hybrid cryptography in bridging current and future needs. By fusing classical elliptic curve methods (e.g., X25519) with ML-KEM, adopters achieve redundancy against unanticipated weaknesses. However, the U.S. CNSA 2.0 guidance notably departs from this trend by disallowing hybrids for certain high-security use cases, instead mandating ML-KEM-1024. Despite these divergent stances, the consensus among many standards agencies is that hybrids can mitigate immediate vulnerabilities while allowing time for more thorough post-quantum testing and refinement.
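The core of a hybrid scheme is the combiner step: two independently derived shared secrets (one classical, one post-quantum) are bound into a single session key, so the session stays secure as long as either component remains unbroken. A minimal sketch of that idea follows; the function name and context label are assumptions for illustration, and real protocols typically use HKDF with full transcript binding rather than a bare hash.

```python
import hashlib

def combine_shared_secrets(ss_classical: bytes, ss_pq: bytes,
                           context: bytes = b"hybrid-kem-demo") -> bytes:
    """Concatenate-then-hash combiner: derive one session key from
    both the classical (e.g., X25519) and post-quantum (e.g., ML-KEM)
    shared secrets. An attacker must break BOTH inputs to recover it."""
    return hashlib.sha256(context + ss_classical + ss_pq).digest()
```

Both endpoints run the same combiner over the same two secrets and arrive at the same session key; compromising only one component algorithm leaves the derived key unpredictable.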
Security in embedded environments extends beyond algorithmic robustness to physical resistance, recognized as a principal challenge for ML-KEM. Threat actors with direct hardware access may exploit power consumption, electromagnetic emissions, or even intentionally injected faults to recover secret material. Because PQC procedures like ML-KEM involve multiple sensitive steps—from number-theoretic transforms (NTT) on polynomials to hash-based re-encryption checks—many points of potential timing and power leakage can be targeted. Side-channel and fault-injection research on ML-KEM reveals vulnerabilities requiring extensive masking, shuffling, and constant-time coding strategies. Even then, compiler optimizations, memory footprints, and ephemeral vs. static key usage can introduce subtle flaws. The paper’s emphasis on thorough, repeated testing and the integration of hardware accelerators for hash functions speaks to the breadth of design changes demanded by post-quantum migration.
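One representative constant-time strategy is the equality check: ML-KEM's Fujisaki–Okamoto decapsulation re-encrypts and compares against the received ciphertext, and a comparison that exits at the first mismatching byte leaks information through timing. The sketch below shows the branch-free accumulate-XOR pattern; note that Python cannot actually guarantee constant-time execution (the interpreter is an assumption-breaker here), so this only illustrates the pattern that embedded C or assembly implementations must realize, and libraries such as `hmac.compare_digest` should be used where available.

```python
def ct_equal(a: bytes, b: bytes) -> bool:
    """Compare two byte strings without early exit: the loop always
    runs over every byte, ORing the XOR differences into one
    accumulator, so timing does not depend on WHERE a mismatch is."""
    if len(a) != len(b):       # lengths are public in this setting
        return False
    acc = 0
    for x, y in zip(a, b):
        acc |= x ^ y           # nonzero iff any byte pair differs
    return acc == 0
```

The same "no secret-dependent branches, no secret-dependent memory addresses" discipline extends to the NTT, rejection sampling, and every other step the paragraph above lists as a leakage target.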
Finally, the paper highlights the role of Keysight in supporting secure PQC development. Through specialized lab services, testing frameworks, and collaborations with industry and academia, organizations can evaluate how next-generation algorithms withstand side-channel and fault-based attacks. Tools such as Keysight’s Inspector platform offer both pre-silicon and post-silicon assessment, ensuring cryptographic vulnerabilities are caught and mitigated early. As regulatory bodies move toward EU Common Criteria or similar certifications for post-quantum algorithms, this proactive testing capacity is increasingly indispensable.
Overall, the paper conveys that while ML-KEM is the leading candidate to replace classical key exchange, implementing it in resource-constrained, security-critical embedded systems is far from trivial. Physical considerations, protocol revisions, and the need for agility in algorithm choice collectively require a comprehensive approach. Early and rigorous testing, hybrid deployments, and a firm grasp of KEM’s unique properties constitute vital steps for any organization seeking to remain technologically competitive and securely prepared for the quantum era.