The weapons pact threatening IT security research
We speak to infosec experts worried by treaty changes
Analysis The US government has rewritten chunks of an obscure weapons trade pact between itself, Europe, Russia, and other nations – a pact that is now casting its shadow over today's computer security tools.
Dubbed the Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies, the pact limits who can buy the really nasty and secret stuff that makes tanks, planes and ships so effective in combat.
Over the past decade, the agreement has been widened to include computer technology. The latest revision of the text, which is now up for discussion prior to approval, has people in the IT security industry severely worried.
The US Commerce Department, via its Bureau of Industry and Security (BIS), is proposing a blanket ban on the export of:
Software 'specially designed' or modified to avoid detection by 'monitoring tools,' or to defeat 'protective countermeasures,' of a computer or network-capable device, and performing any of the following:
(a) The extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or (b) The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.
Taken in its broadest sense, this wording could cover a multitude of legitimate software tools – even basic things like antivirus packages – which could then not be exported without a government-approved license, assuming one is even obtainable.
And by exported, we mean downloaded anywhere outside its country of origin, or installed on a laptop and taken abroad.
Security researchers are worried that the programming, debugging and reverse-engineering utilities they rely on will be export-controlled, preventing them from using the software unless a government grants them permission.
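To see why researchers fear the definition is overbroad, consider that an ordinary debugger breakpoint arguably matches both clauses of the proposed text: it extracts data from a running program, and it hooks the program's standard execution path to run externally provided instructions. A minimal sketch using Python's standard `sys.settrace` hook (a hypothetical illustration for this article, not code named in the proposal):

```python
import sys

# A stand-in "target" function whose execution we will intercept,
# the way a debugger intercepts a function at a breakpoint.
def target(x):
    return x * 2

observed = []

def breakpoint_hook(frame, event, arg):
    # Externally provided instructions, run whenever target() is entered --
    # this is what a debugger's breakpoint handler does.
    if event == "call" and frame.f_code is target.__code__:
        # "Extraction of data or information" from the running program:
        # read the live argument out of the target's stack frame.
        observed.append(frame.f_locals["x"])
    return None

sys.settrace(breakpoint_hook)   # hook the interpreter's execution path
result = target(21)             # observed now holds the intercepted argument
sys.settrace(None)              # remove the hook
```

Nothing here is malicious – it is the core mechanism of every debugger and many profilers and test tools – yet read literally, the proposed language could sweep it in.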
So what exactly has happened to cause these changes?
Good intentions backfire
The primary reason given for the changes is to stop repressive regimes around the world from buying sophisticated software that can be used to spy on political opponents and others.
This snoop-ware usually exploits security vulnerabilities in the targets' computers to silently and secretly install itself. Companies like Gamma International and the Italian-based Hacking Team will sell surveillance software to almost all comers.
The updated language tries to crack down on this trade of vulnerability-exploiting super-spyware; as a result, it puts a significant crimp in the sale and exchange of information about exploitable software security flaws.
The market for zero-day vulnerabilities can be a lucrative one; the new language bans the sale of details of unpatched flaws to anyone other than one's own government.
"There is a policy of presumptive denial for items that have or support rootkit or zero-day exploit capabilities," said Randy Wheeler, director of the information technology controls division of BIS, during a conference call discussing the new rules.
But as Nate Cardozo, staff attorney for the Electronic Frontier Foundation, explained to The Register, those rules could be taken to cover tools that spot zero-day attacks, and proof-of-concept code crafted to prove that vulnerabilities can be exploited by bad guys. And while keeping spyware out of the hands of repressive regimes is admirable, the proposed rules go way too far, we're told.
"This was drafted by someone who doesn't understand security research and the effect of its implementation, not just on researchers but the general public as well: It's ludicrous," Cardozo said.
"These implementations will do nothing to keep surveillance software out of the hands of the worst actors, and will harm those who are seeking to limit its use."
Under the terms of the agreement, if researchers want to use commercial software named by the US Department of Commerce, an export license is needed, he explained. Obtaining one is an arduous process that requires a huge amount of form-filling and the hiring of an export-specialist lawyer – and those folks don't come cheap.
That's not a problem if you're a defense giant like Boeing or Lockheed Martin, but if you're a lone researcher then the costs are hideous. And there's an additional hurdle – you need to be a legal adult to apply.
Kids today, eh
"Security research is increasingly a young person's game," Katie Moussouris, chief policy officer for vulnerability disclosure specialists HackerOne, told El Reg.
"The soldiers we are enlisting in the security fight are under draftable age. Setting up further hoops for them to jump through will drive people into underground markets."
Moussouris is an expert in this field: she convinced Microsoft to set up a bug bounty program while employed in Redmond, and now spends her time analyzing the market for security researchers and vulnerabilities. She's convinced that the new rules will do more harm than good.
The new rules were devised in conjunction with defense contractors and privacy experts, each of whom has their own agenda. The security industry wants a level of regulation that will squeeze out smaller companies, she suggested, while privacy experts want the zero-day market shut down.
Chris Soghoian, principal technologist for the ACLU, would certainly be happy for the market in zero-days to be killed off, and – while he wasn't available for interview – appeared to defend the new rules and blame researchers for bringing this on themselves.
But Moussouris said the new rules go too far, and fears the US government is heading into a fight without understanding the core issues. Other researchers in the field share her concern.
Jonathan Zdziarski, a prominent security expert who has trained law enforcement across the US in computer forensics, said that if the proposed Wassenaar rules were in place back in 2008, he wouldn't be in business right now.
"The tools and techniques I have developed are by no means 'intrusion' tools; however, due to the excessively broad nature of the Wassenaar proposal, they would fall under its regulations, as they bypass security mechanisms of devices and collect information from them," he commented.
"This proposal stands to only damage those looking to contribute to a better and more secure community. Wassenaar has a deterrent component, and at the heart of security research are many independent researchers like myself who will simply stop contributing if there is a fear of prosecution simply for sharing knowledge in the form of code." ®