Interesting People mailing list archives

Re What if Responsible Encryption Back-Doors Were Possible? - Lawfare


From: "Dave Farber" <farber () gmail com>
Date: Sun, 2 Dec 2018 23:56:34 +0900




Begin forwarded message:

From: Tom Glocer <tom () glocer com>
Date: December 2, 2018 at 10:27:52 PM GMT+9
To: "dave () farber net" <dave () farber net>
Subject: Re: [IP] Re What if Responsible Encryption Back-Doors Were Possible? - Lawfare

Dave 

This debate reminded me of a long piece I posted to my blog in 2016 on the Apple iOS security debate.

Regards
Tom

http://www.tomglocer.com/2016/03/03/apple-back-doors-and-the-city-on-a-hill/

Apple, Back Doors and the City on a Hill
Bad facts make bad law. This well-worn legal aphorism may well describe the state of American privacy law if the FBI 
is successful in its bid to compel Apple to write a special version of its iPhone operating system to provide a “back 
door” into one of its legacy devices.
 
Ostensibly, the question presented is whether Apple can refuse a court order to assist the FBI in gathering evidence 
from one specific phone provided by the employer of a dead terrorist. However, the potential precedential 
significance of the case is far greater.
 
The late Syed Rizwan Farook, together with his wife, Tashfeen Malik, killed 14 people and wounded 22 others in San 
Bernardino, California before meeting a swift and just end. As part of its investigation, the Federal Bureau of 
Investigation sought and obtained a court order under the All Writs Act of 1789 directing Apple to write a special 
version of its iOS to defeat two security features included in the operating system of Farook’s legacy 5c iPhone. In 
its request, the FBI was at pains to stress the supposedly limited nature of the cooperation sought: just a little 
bit of code to be downloaded one time on a single out-of-date phone owned by the government employer of a dead 
terrorist-murderer. How could that be unreasonable?
 
In this case I believe it is, but the decision is neither an easy one nor an absolute one for all cases. Rather, as I 
argue below, these types of requests by law enforcement should be judged on a reasonableness standard based on their 
unique facts and circumstances, with personal privacy and freedom of communication given a strong but rebuttable 
presumption.
 
The Farook case is just one of many requests the FBI has made to Apple to assist it in recovering information 
believed to be stored on its phones or in the iCloud. The FBI has chosen well to litigate this matter as the Farook 
set of facts makes an attractive test case for them. While Apple has generally complied with such requests in the 
past, it has refused in this instance because it argues that in balancing public safety versus our right to privacy, 
acceding to the FBI request in this case would set a dangerous precedent. In particular, Apple argues that it will be 
impossible to limit the “back door” the FBI seeks to only unlock the late terrorist’s phone without jeopardizing 
others. To understand why requires some additional background on Apple technology and criminal procedure.
 
What the FBI is asking Apple to do in the Farook case is to create a special version of the iOS (it has been dubbed 
“govtOS”) that once installed on Farook’s 5c will (i) allow an unlimited number of attempts to enter the four-digit 
passcode and (ii) eliminate the artificial and progressive delay that the standard iOS introduces after repeated 
failed attempts. This will allow the FBI to use what is known as a “brute force” technique: trying, if necessary, all 
10,000 possible four-digit combinations until the phone unlocks.
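
To make the arithmetic concrete, here is a minimal sketch in Python of such a brute-force search. The try_passcode function is purely hypothetical, standing in for however candidate passcodes would be submitted to the device; it is not an Apple API.

    import itertools
    import time

    def brute_force_pin(try_passcode, delay_per_attempt=0.0):
        # Try every four-digit passcode, 0000 through 9999, until one unlocks the device.
        # try_passcode is a hypothetical callback that returns True on success;
        # delay_per_attempt models any enforced pause between attempts.
        for digits in itertools.product("0123456789", repeat=4):
            candidate = "".join(digits)        # "0000", "0001", ..., "9999"
            if try_passcode(candidate):        # at most 10,000 attempts in total
                return candidate
            if delay_per_attempt:
                time.sleep(delay_per_attempt)  # escalating delays are what make this impractical
        return None

With no cap on attempts and no enforced delay, 10,000 tries take almost no time; with the standard iOS attempt limits and progressive delays in place, the same search becomes impractical, which is exactly why the FBI asked for those two features to be disabled.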
 
The FBI analogizes its request to serving a lawful search warrant on a landlord to gain access to a suspect’s 
apartment with a master key. I think a more apt analogy would be a request to the landlord to send a team of its 
handymen to replace an existing wall of the suspect’s apartment with a new fake wall that includes a secret doorway. 
The extraordinary nature of the request consists in forcing the recipient to build something it would not otherwise 
undertake and which, in fact, it believes would render the apartment (or phone) insecure. The secret doorway, like the 
special iPhone back door, would permit anyone less trustworthy than the FBI to access the contents of the 
apartment/phone.
 
As for the criminal procedure context, it is important to understand what would happen if and when the FBI gained 
access to Farook’s phone, in particular to determine whether the FBI would be able to keep its promise that the 
special back door created for the Farook case would never become public or fall into dangerous hands. Since both 
prime suspects are dead, investigators probably want the phone to discover if there are accomplices who could be 
identified via information currently encrypted on the phone or if there is other information that would be useful in 
preventing future attacks or adding to our understanding of terrorist networks. There is no suggestion, however, that 
there exists a clear and present danger of another specific attack or lives in danger that makes access to the 
contents of Farook’s phone critical. Such a danger would, as in the First Amendment free speech context from which I 
borrow these talismanic words, convince me to overcome my presumption in favor of personal privacy and agree to the FBI 
request. Absent such an extreme threat, I assume the investigation might play out on the following lines: I imagine 
that via access to Farook’s phone, the FBI could discover, arrest and charge one or more other conspirators. What 
would happen next? The defense lawyers and their forensic experts would demand access to the rogue govtOS code to 
dispute that the system worked as alleged to implicate their client.  For example, they would be free to argue that 
the name of their client was inserted by the FBI via the special govtOS code and not present in the original memory. 
Thus, at least to me, it does not sound farfetched that the back door code would then be released and pass through 
many hands, exposing us all to cyber insecurity.
 
While we are on the subject of the law, I find it ironic that all of the candidates seeking the 2016 Republican 
nomination extol their steadfast dedication to the Second Amendment to the US Constitution (conferring a right to 
bear arms), but rush quickly to condemn Apple for refusing to comply with the magistrate’s ex parte order to build a 
back door into the iPhone. As I understand it, a major tenet of the conservatives’ insistence on the right to carry 
concealed weapons or automatic rifles is that this will protect Americans from the potential tyranny of their 
government. However, if my goal is protecting liberty, I find it far more effective to protect our right of free 
speech and freedom of assembly by allowing the use of encrypted communications than maintaining an arsenal at home 
that I can use against a drone-equipped national military. Hence my preference for a rebuttable presumption in favor 
of privacy, just as I support the right of Americans to own guns but not AR-15 automatic weapons; Constitutional 
rights should not be absolute when they bump up against each other.
 
Finally, I believe there is one additional and decisive reason to support Apple in its refusal to create even 
“limited” back doors. The Internet is a great force for freedom and democracy based on its ability to connect us, 
remove friction and rapidly disseminate information. Like most technologies, it is also “dual use” and can facilitate 
terrorist coordination and cybercrime. No matter how noble the motives of the FBI in seeking to crack Syed Farook’s 
iPhone, repressive regimes around the world are watching and will happily order Apple and other American technology 
companies to write potentially more dangerous and intrusive “limited” instruments to facilitate their law enforcement 
efforts. They may do so even if US courts ultimately support Apple’s position; however, the United States should not 
abandon the moral high ground.
 
The City on a Hill should not leave the back door open.
 

Published March 3, 2016, at 9:32 pm by tomglocer
Comment 

On Sat, Dec 1, 2018 at 10:42 PM Dave Farber <farber () gmail com> wrote:



Begin forwarded message:

From: Herb Lin <herblin () stanford edu>
Date: December 2, 2018 15:25:49 JST
To: "dave () farber net" <dave () farber net>, ip <ip () listbox com>
Cc: "ross.stapletongray () gmail com" <ross.stapletongray () gmail com>
Subject: RE: [IP] Re What if Responsible Encryption Back-Doors Were Possible? - Lawfare

Here’s the industry event to which Ross refers (https://youtu.be/tH7pHJAO1t8).  Ross is right that I said some 
things that were said during the Clipper debate.  That’s because some of the things said in favor of Clipper were 
valid.  That doesn’t mean that Clipper was a good idea. 

 

If someone wants to challenge me on something specific that I said during that talk, I’m happy to engage in that 
discussion.  That includes Ross, by the way.

 

The short version of what I said – or what I was trying to say, in any case – was that the technical debate is over 
as far as I am concerned – I fully accept the conclusion that it is impossible to develop an encryption system 
with exceptional access that is as secure as one without it.   But the advocates of responsible encryption are 
asking for something else—they are asking for the most secure system possible subject to the constraint that 
exceptional access is possible.  Whatever system comes out of that process *will* be less secure than what is 
possible without exceptional access.

 

Whether the diminished security is or is not worth the benefits to law enforcement is a policy question, not a 
technical question.  Advocates of exceptional access say “yes”; privacy advocates say “no.”  Both are reasonable 
answers, but neither should pretend that their judgments are technically based—they are policy judgments.  For 
myself, I note that policy judgments – unlike technical conclusions – are necessarily made in the particular 
societal and political circumstances extant at the moment of that judgment, and so anyone making a policy 
judgment ought to take those circumstances into account.

 

I confess to being surprised at Ross’s assertion that I am “fine with the potential to arm fascists in the 
information age,” which is as close to an ad hominem attack as I’ve ever heard him make on me or anyone else.  If 
intellectual honesty is part of that potential, then I regret that I have to plead guilty.  But by the same 
token, I think that anyone who works to develop better information technology also has to plead guilty, since 
it’s impossible to make information technology unusable by fascists.

 

Herb

 

 

=======================================================================

Herb Lin

Senior Research Scholar, Center for International Security and Cooperation

Hank J. Holland Fellow in Cyber Policy and Security, Hoover Institution

Stanford University

Stanford, CA  94305  USA

herblin () stanford edu

Twitter @HerbLinCyber

 

From: Dave Farber <farber () gmail com> 
Sent: December 1, 2018 7:27 PM
To: ip <ip () listbox com>
Subject: [IP] Re What if Responsible Encryption Back-Doors Were Possible? - Lawfare

 




Begin forwarded message:

From: Ross Stapleton-Gray <ross.stapletongray () gmail com>
Date: December 2, 2018 10:46:39 JST
To: DAVID FARBER <dave () farber net>
Subject: Re: [IP] Re What if Responsible Encryption Back-Doors Were Possible? - Lawfare

On Sat, Dec 1, 2018 at 5:38 PM Dave Farber <farber () gmail com> wrote:

Haven’t we been around this idea many many times like Clipper chip etc

Is there no memory in the system?

 

We have been. There's just a dogged persistence among those who would like the government's first-order job of 
knowing things for the sake of control to be easy. I heard Herb Lin speak on this at an industry event, and it was like 
Clipper all over again. Stu Baker similarly. I'm not exactly sure what drives either, as Baker hasn't been 
working for the NSA for decades, and Herb is at Stanford. But both are fine with the potential to arm fascists in 
the information age.

 

Meanwhile, I'm back looking for work, as Rocket Lawyer (which had been a fascinating four months) seems to be 
imploding, and let a lot of us go. But the market is great... I've got a site interview with a Kleiner-backed 
tech startup in a week, and interviews for a privacy engineer position with a major non-profit.

 

But ideas for where else to look always solicited gratefully!

 

Ross




