“Requiring people to build a vulnerability may be a strategic mistake.”
Michael Chertoff, the former head of the Department of Homeland Security and a former federal prosecutor, made some surprising remarks last week, coming out strongly against cryptographic backdoors that could be provided to the government upon request.
“I think that it’s a mistake to require companies that are making hardware and software to build a duplicate key or a back door even if you hedge it with the notion that there’s going to be a court order,” he said to the crowd at the Aspen Security Forum.
This sentiment stands in contrast to what the FBI and other top government officials have said while lamenting the problem of “going dark”—the idea that criminals, ne’er-do-wells, and miscreants have access to more encryption than ever before, and that’s bad for law enforcement.
Effectively, some in government have called for such “golden keys” as a way to let only authorized law enforcement break strong crypto under certain legal circumstances. FBI Director James Comey made the case for such backdoors earlier this month before a Senate committee.
However, the tactic has been roundly lampooned in cryptography, privacy, and legal circles. Chertoff cogently provided many counterarguments to Comey and his allies, as Emptywheel pointed out:
First of all, there is, when you do require a duplicate key or some other form of back door, there is an increased risk and increased vulnerability. You can manage that to some extent. But it does prevent you from certain kinds of encryption. So you’re basically making things less secure for ordinary people.
The second thing is that the really bad people are going to find apps and tools that are going to allow them to encrypt everything without a back door. These apps are multiplying all the time. The idea that you’re going to be able to stop this, particularly given the global environment, I think is a pipe dream. So what would wind up happening is people who are legitimate actors will be taking somewhat less secure communications and the bad guys will still not be able to be decrypted.
The third thing is that what are we going to tell other countries? When other countries say great, we want to have a duplicate key too, with Beijing or in Moscow or someplace else? The companies are not going to have a principled basis to refuse to do that. So that’s going to be a strategic problem for us.
Finally, I guess I have a couple of overarching comments. One is we do not historically organize our society to make it maximally easy for law enforcement, even with court orders, to get information. We often make trade-offs and we make it more difficult. If that were not the case then why wouldn’t the government simply say all of these [takes out phone] have to be configured so they’re constantly recording everything that we say and do and then when you get a court order it gets turned over and we wind up convicting ourselves. So I don’t think socially we do that.
And I also think that experience shows we’re not quite as dark, sometimes, as we fear we are. In the 90s there was a—when encryption first became a big deal—debate about a Clipper Chip that would be embedded in devices or whatever your communications equipment was to allow court ordered interception. Congress ultimately and the President did not agree to that. And, from talking to people in the community afterwards, you know what? We collected more than ever. We found ways to deal with that issue.
So it’s a little bit of a long-winded answer. But I think on this one, strategically, we, requiring people to build a vulnerability may be a strategic mistake.
In addition to Chertoff, former National Security Agency head Gen. Michael Hayden came out against backdoors.
“I hope Comey’s right, and there’s a deus ex machina that comes on stage in the fifth act and makes the problem go away,” Hayden, who now works for Chertoff and was at the same conference, told The Daily Beast. “If there isn’t, I think I come down on the side of industry. The downsides of a front or back door outweigh the very real public safety concerns.”
For his part, President Barack Obama seems somewhat sympathetic to these arguments, but his administration hasn’t clearly articulated a position. In February 2015, the president said he was a “believer in strong encryption” but also “sympathetic” to law enforcement’s need to prevent terror attacks.