SIKE attack: IoT cryptography in the quantum age

On July 30th, 2022, a paper was published revealing a devastating attack on the Post-Quantum Algorithm (PQA) SIKE. The algorithm was well respected as a candidate for further study in the fourth round of the NIST PQA process. It was particularly favoured by large technology companies such as AWS, Cloudflare and Google, which had been experimenting with and deploying SIKE on their cloud infrastructure. The attack showed that keys for the most secure instances of SIKE could be recovered in under a day on a single core, says Daniel Shiu, chief cryptographer at Arqit.

This was the third piece of cryptanalysis in 2022 that has caused us to reassess the security of NIST candidates (after Beullens’ “Breaking Rainbow takes a weekend on a laptop” and the Israel Defence Forces’ analysis of lattice algorithms such as Kyber, Saber, and Dilithium). This boom in cryptanalysis (including earlier results on GeMSS and SPHINCS+) might be expected as the list of candidates thins out and more attention is focussed on the survivors, but it is a sign that, five years after the process was started, we still do not have a mature understanding of the security of PQAs. In the past, similar experiences with Public Key Cryptography (PKC) have seen cryptographers scrambling to increase key sizes that were supposed to have been secure for millions of years, while dealing with legacy insecure systems vulnerable to downgrade attacks such as DROWN and Logjam. Why then are we rushing into this redesign of Internet cryptography?

The wrong answer to the wrong question

The NIST PQA process arises from a desire for “drop-in” replacements for existing key establishment and authentication methods on the Internet. This admirable goal of migrating to quantum-safe cryptography with the minimum disruption to users is proving hard to achieve. The PQA methods have greater resource requirements than their classical predecessors, whether in terms of bandwidth, computation, size of codebase or combinations of these.

The existing Internet protocols are proving to be a poor fit for the new algorithms, so that major changes to Internet communication are being considered. The SIKE attack is particularly painful from this point of view as it was the PQA with the best bandwidth properties by far. Beyond the world of standards, there are also the challenges of securely implementing the complex new ideas in code, of integrating that code into products and moving users away from legacy products. Major projects to manage the transition over the coming decades are now being started.

All this effort is to keep as close as possible to the dream of drop-in replacement, but it is worth taking a step back and considering why the existing methods were first adopted and how they are already evolving due to a changing Internet. PKC was introduced to the Internet of the early 90s, when much of the traffic was still based around enterprise mainframes, connectivity was sporadic, and services might be unavailable for days or weeks at a time.

The Public Key Infrastructure (PKI) of certificates allowed key establishment and authentication to be passively mediated offline so that Certificate Authorities did not have to be constantly available. Since then, the Internet has evolved through the AOL era of PCs and dial-up, the laptop and Wi-Fi era, the smartphone and server farm era, and is now moving towards the Internet of Things. The core Internet (a concept that was meaningless in the 90s) assures us of very reliable connectivity and availability, while lightweight edge devices (another 21st-century Internet concept) push us towards reducing bandwidth, computational burden, and memory footprint.

Major Internet service providers are most acutely aware of these changes and have been driving a reduction in the use of PKC, through federated authentication methods (such as when we are invited to sign in using Google or Facebook) and session tickets that refresh keys without using PKC. The TLS 1.3 standard is especially forward-looking in its compatibility with keys established without PKC. These methods are also quantum safe when founded on the well-established and assured methods of symmetric cryptography.

Stronger, simpler encryption

Symmetric cryptography is more robust than PKC and also massively more efficient in terms of bandwidth, computation, and memory footprint. These desiderata for the smartphone and server farm era of the Internet become critical requirements when we start to think of IoT. Now, although primarily thought of as a means of bulk encryption requiring shared secret keys, symmetric cryptography can also provide authentication using Hash-based Message Authentication Codes (HMACs) and key establishment using methods such as Kerberos.
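To make the authentication point concrete, here is a minimal sketch (not any particular vendor’s implementation) of HMAC-based message authentication using only the Python standard library. The key value and message are illustrative placeholders; in practice the key would be provisioned securely to both devices.

```python
import hmac
import hashlib

# A 256-bit secret key shared in advance by sender and receiver
# (illustrative value only; real keys come from secure provisioning).
key = bytes.fromhex(
    "000102030405060708090a0b0c0d0e0f"
    "101112131415161718191a1b1c1d1e1f"
)

message = b"telemetry reading: 21.5C"

# Sender computes a tag over the message and transmits both.
tag = hmac.new(key, message, hashlib.sha256).digest()

# Receiver recomputes the tag from the shared key and compares
# in constant time; a match authenticates the message's origin.
expected = hmac.new(key, message, hashlib.sha256).digest()
authentic = hmac.compare_digest(tag, expected)
```

Note the use of `compare_digest` rather than `==`: constant-time comparison avoids leaking tag bytes through timing, one of the implementation pitfalls that symmetric schemes share with any cryptography on constrained devices.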

Symmetric key establishment was not considered a good solution for the unreliable Internet of the 90s because of the need for reliable connectivity to, and availability of, a Key Distribution Centre. In the 21st century, these concerns go away. In particular, the era of cloud computing allows us to decentralise Key Management Services and achieve extremely high availability and reduced latency at a global scale. Methods can be added to split trust away from the mediation services and provide end-to-end security. The methods of symmetric key establishment are robust, efficient, quantum safe and integrate well with existing standards such as TLS and IPsec.

Demonstrations with smart cities and unmanned vehicles show that systems can be transitioned to these methods today, quickly and with minimal disruption to existing services and users. New IoT deployments become much easier to roll out using MQTT without the burden of PKI. The savings on energy consumption and memory requirements drive down costs and extend lifetimes of devices. These benefits in turn broaden use cases and increase the value of IoT approaches. All that is needed is the willingness to move away from the misplaced and unachievable dream of finding a new, drop-in way of solving the problems of the Internet of the 90s.

The author is Daniel Shiu, chief cryptographer at Arqit.

