Kiss goodbye to privacy forever when brain-implanted comms gear becomes the norm – guru Whit Diffie

Top cryptographers open fire on AI, quantum computing, NFTs, and more

RSAC Top cryptographers – including Ron Rivest and Adi Shamir, the R and the S in RSA – on Monday played down the impact of AI and quantum computing, shrugged off NFTs, and responded to the development of a mathematical technique that allegedly thwarts today's public key encryption.

One of the experts even warned we're a generation away from totally destroying our privacy with brain-connected communications devices.

RSA encryption

The aforementioned mathematical technique was documented in a non-peer-reviewed paper by respected German cryptographer Claus-Peter Schnorr that emerged in March and has undergone revisions since. He claimed he had devised a method to find the prime factors p and q of an RSA modulus n far faster than rival algorithms, boasting in one draft: "This destroys the RSA cryptosystem."

Reader, it will not.

Speaking at the RSA Conference's Cryptographers' Panel, Rivest said he'd contacted Schnorr and others as soon as he heard about the paper, pointed out some possible issues, and reckons we'll have to wait "until the dust settles" on whether the technique works as stated.

"I tend to be skeptical and the proof is in the pudding when it comes to factoring – I want to see numbers get factored," Rivest said. "Factoring has a very important property that you can demonstrate that you can factor without needing to reveal how. You can factor some of the challenge numbers and give people notice if it works."

Shamir added that, while Schnorr's proposed technique does seem to speed up the factorization of numbers, the speedup appears to be smaller than the paper promises, and wouldn't be effective against the key sizes used today by cryptosystems like RSA.
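For readers rusty on the maths: an RSA public key is a modulus n, the product of two secret primes p and q, plus a public exponent e. Anyone who factors n can recompute the private exponent and decrypt at will, which is why a genuinely fast factoring method really would destroy RSA. Here's a minimal sketch in Python, using the textbook toy values p = 61 and q = 53 purely for illustration; real moduli run to thousands of bits.

# Toy illustration of why factoring n = p * q breaks RSA.
# The primes here are tiny textbook values, not realistic parameters.
p, q = 61, 53
n = p * q                    # public modulus, 3233
e = 17                       # public exponent
phi = (p - 1) * (q - 1)      # only computable once you know p and q
d = pow(e, -1, phi)          # private exponent, the modular inverse of e

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key
recovered = pow(ciphertext, d, n)  # decrypt with the reconstructed private key
assert recovered == message        # whoever factored n now reads the plaintext

Scaling that trick from a four-digit n to a 2,048-bit one is the part Schnorr's paper has yet to demonstrate.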

NFT == NBD

Another thing that drew fire was NFTs, which Rivest said were "a bit like homeopathic medicine" in that there's not really a lot to them: the blockchain tokens merely point to somewhere you can download images and other files, rather than contain the actual media. Shamir was more upbeat: "Certainly it's not harmful. Some people collect coins, some people collect stamps, some people collect NFTs. If they want to pay money for this, it's fine with me."

Quantum money grab

Both were also skeptical that quantum computers will defeat today's encryption any time soon, though the most trenchant criticism of the technology came from Prof Ross Anderson of England's University of Cambridge.

Physicists saw the pile of money poured into researching encryption and wanted the same for quantum mechanics, he said, adding it was a way "for number theorists to get their shovels into the military budget." The research has been useful for developing quantum sensing devices, though he was "entirely unimpressed" with the technology's code-breaking abilities and doubted quantum computers would ever work as decryption engines.

AI too easy to hoodwink

He was similarly unimpressed with machine-learning-powered computer security tools, saying they should be easy to confuse and defeat, judging by his research on natural-language processing systems. Such systems have to be very finely tuned to work properly, and introducing certain data and variables can often "send them haywire." An adversary could find ways to feed bad packets into an AI network scanner, say, to either fool it into allowing malicious traffic through, or trick it into denying all traffic and shutting down connectivity. We've written a lot about these sorts of scenarios.
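To make that evasion scenario concrete, here's a minimal, entirely hypothetical sketch: a toy linear classifier over two made-up packet features, which an attacker able to probe the model nudges just far enough to slip past. No real intrusion-detection product or dataset is implied.

# Hypothetical toy example of evading an ML traffic classifier.
# The model is a two-feature linear scorer invented for illustration.
import numpy as np

weights = np.array([2.0, -1.5])   # assumed learned weights
bias = -0.5

def flags_as_malicious(features):
    return weights @ features + bias > 0

malicious = np.array([1.2, 0.3])
print(flags_as_malicious(malicious))   # True: the scanner catches it

# An attacker who can query the model pushes the features in the
# direction that most lowers the score, leaving the payload intact.
evasive = malicious - 0.8 * weights / np.linalg.norm(weights)
print(flags_as_malicious(evasive))     # False: the traffic slips through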

Rivest agreed, reminding us that complexity is the enemy of security, and such AI systems were very complex indeed.

Carmela Troncoso, head of the Security and Privacy Engineering Laboratory at École Polytechnique Fédérale de Lausanne in Switzerland, also agreed, pointing out that resilience isn't the only problem: it remains an open question whether a machine-learning system can be built that is also explainable, fair, and privacy-preserving.

Privacy will be a myth

This last factor was also on Whitfield Diffie's mind. The industry guru, co-creator of the Diffie–Hellman key exchange, appeared in a Q&A at the end of the panel session and predicted that privacy, as we know it, will be gone in a generation when communications devices implanted in people's bodies become the norm.

I saw no way that human freedom could stand against improving communications

"For a long time I've been saying that I saw no way that human freedom could stand against improving communications," he opined.

"And I doubt we're a decade from a bunch of early adopters getting communicators put in their head and you won't have to force people to get them because you won't be competitive without it. The freedom we now enjoy will become very hard to come by and we'll think 'oh gosh, back in those days when we actually had privacy'".

As to what Diffie thought was the biggest security threat, his answer was simple: "Companies." ®
