Month: February 2016

Thoughts on Apple’s Response to Court Order

Apple filed their response to the court order seeking to compel them to create a new operating system with security features removed. Since I did a post on the DOJ’s motion, I thought I’d also do one with my thoughts as I read the Apple response.

Page 1 line 3: This is not a case about one isolated iPhone.

Nobody believes this. Even FBI Director James Comey, who has made that argument, had to come out and admit that the case “will be instructive for other courts,” and that the outcome would affect other cases.

Page 1 lines 4-5: this case is about the Department of Justice and the FBI seeking through the courts a dangerous power that Congress and the American people have withheld.
Page 1 lines 13-15: In fact, no court has ever authorized what the government now seeks, no law supports such unlimited and sweeping use of the judicial process, and the Constitution forbids it.
Page 1 lines 16-17: Since the dawn of the computer age, there have been malicious people dedicated to breaching security and stealing stored personal information.
Page 1 lines 21-22: In the face of this daily siege, Apple is dedicated to enhancing the security of its devices

Apple sure does seem to be using a lot of grandiose language. I feel like I’m getting ready to watch Star Wars or something, or at least something more exciting than a boring, old legal brief.

Page 2 lines 7-8: There are two important and legitimate interests in this case: the needs of law enforcement and the privacy and personal safety interests of the public.

I think it’s important to frame this debate as law enforcement vs. security, rather than as security vs. privacy. If the DOJ is able to win this case (and the sure-to-follow appeals), it will lead to a reduction in security, as backdoors would become something the government could compel organizations to create.

Note: I wanted to link to a really good blog post on whether this is a “backdoor” or not, as people have been latching onto that word. I can’t remember who it was and just wasted too much time trying to find it, but the gist was this: if you have Program A, which removes security features so that a formerly-secure product can be accessed, then Program A is a backdoor. You could also have Program B, which only allows Program A to run in certain situations. Program B may not itself be a backdoor, but it relies on Program A, which could be used on any device. It made a lot more sense when he wrote it, and I really wish I could find it again.

Page 2 lines 26-27: And once developed for our government, it is only a matter of time before foreign governments demand the same tool.

This is just ignored by all the “It’s just one phone!” folks. Our government has decent civil liberties protections for citizens. Other countries don’t, though. If the DOJ wins this case, other countries will expect the same assistance. Apple would face tremendous pressure to comply, and that process would almost certainly be used on political dissidents in authoritarian countries.

Page 3 line 18 – page 4 line 2: even if such limitations could be imposed, it would only drive our adversaries further underground, using encryption technology made by foreign companies that cannot be conscripted into U.S. government service—leaving law-abiding individuals shouldering all of the burdens on liberty, without any offsetting benefit to public safety.

This is why this needs to be framed as a decision pitting less security vs. more security. Encryption is out there. It’s not going away no matter what the U.S. government wants (see the worldwide encryption products survey by Bruce Schneier and others).

Page 4 lines 6-12: Finally, given the government’s boundless interpretation of the All Writs Act, it is hard to conceive of any limits on the orders the government could obtain in the future. For example, if Apple can be forced to write code in this case to bypass security features and create new accessibility, what is to stop the government from demanding that Apple write code to turn on the microphone in aid of government surveillance, activate the video camera, surreptitiously record conversations, or turn on location services to track the phone’s user? Nothing.

This is a bit of a sky-is-falling argument, but it wouldn’t surprise me if law enforcement really did want those capabilities.

Page 4 lines 13-17: As FBI Director James Comey expressly recognized:

Democracies resolve such tensions through robust debate. . . . It may be that, as a people, we decide the benefits [of strong encryption] outweigh the costs and that there is no sensible, technically feasible way to optimize privacy and safety in this particular context, or that public safety folks will be able to do their job well enough in the world of universal strong encryption. Those are decisions Americans should make, but I think part of my job is [to] make sure the debate is informed by a reasonable understanding of the costs.

Nice use of Director Comey’s words to make their point right there. It seems to be a pattern where he says something and then later says the exact opposite thing (see the “one phone” vs. precedent item above).

Page 6 lines 15-17: For one, Apple uses a “large iteration count” to slow attempts to access an iPhone, ensuring that it would take years to try all combinations of a six- character alphanumeric passcode.

This points out that the best thing you can do is to change the setting away from the default number-only passcode and make it an alphanumeric passcode. As long as that alphanumeric passcode isn’t something obviously guessable, your phone couldn’t be opened, even with the “FBiOS”.
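The arithmetic behind the “years to try all combinations” claim is easy to check. Apple’s iOS security documentation has said each passcode attempt requires roughly 80 milliseconds of on-device key derivation; taking that figure as an assumption, a quick sketch:

```python
# Sketch of the brute-force arithmetic. The ~80 ms per guess figure is an
# assumption taken from Apple's iOS security documentation (key derivation
# is tuned so each passcode attempt takes about 80 ms on-device).

SECONDS_PER_GUESS = 0.08                  # ~80 ms key derivation per attempt
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_exhaust(alphabet_size, length):
    """Worst-case time to try every passcode of the given length."""
    combinations = alphabet_size ** length
    return combinations * SECONDS_PER_GUESS / SECONDS_PER_YEAR

# 6-digit numeric passcode: small enough to exhaust in under a day
print(f"6-digit PIN:         {years_to_exhaust(10, 6) * 365.25:.1f} days")
# 6 chars, lowercase letters + digits (36 symbols): about 5.5 years
print(f"6-char alphanumeric: {years_to_exhaust(36, 6):.1f} years")
```

The jump from a 10-symbol alphabet to a 36-symbol one is what turns hours into years, which is why switching away from the numeric default matters so much.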

Page 8 lines 10-17: In addressing the twin needs of law enforcement and privacy, Congress, through CALEA, specified when a company has an obligation to assist the government with decryption of communications, and made clear that a company has no obligation to do so where, as here, the company does not retain a copy of the decryption key. 47 U.S.C. § 1002(b)(3). Congress, keenly aware of and focusing on the specific area of dispute here, thus opted not to provide authority to compel companies like Apple to assist law enforcement with respect to data stored on a smartphone they designed and manufactured.

This seems to me like a pretty good argument. My understanding is that the All Writs Act is for situations where the law is silent. In this case, the law isn’t silent. It specifically says that Apple does not have an obligation to assist law enforcement.

Page 9 lines 12-14: Moreover, members of Congress have recently introduced three pieces of legislation that would affirmatively prohibit the government from forcing private companies like Apple to compromise data security.

To be fair, other members of Congress have proposed legislation that would require companies like Apple to compromise data security.

Page 11 footnote 21: In its motion to compel, filed February 19 with this Court, the government sought to shift the blame to the “owner” (San Bernardino County) in describing who changed the password and why it allegedly has no other viable alternatives besides the creation of a new operating system. Dkt. 1 at 18 n.7. The FBI later issued a press release acknowledging that it “worked with” the County to reset the password.

Nice little dig at the FBI. Yet another example of law enforcement being quite duplicitous.

Page 11 footnote 22a: The government obtained the Order without notice to Apple and without allowing Apple an opportunity to be heard. See Mullane v. Cent. Hanover Bank & Tr. Co., 339 U.S. 306, 314 (1950) (recognizing that one of the “‘fundamental requisite[s] of due process of law is the opportunity to be heard’”) (quoting Grannis v. Ordean, 234 U.S. 385, 394 (1914)).

The Order also made it quite clear that Apple could file a motion challenging the validity of the order, so I think they’re pushing their luck here trying to argue they weren’t given the “opportunity to be heard.” That’s probably why it’s in a footnote, though.

Page 11 footnote 22b: But this was not a case where the government needed to proceed in secret to safeguard its investigation; indeed, Apple understands that the government alerted reporters before filing its ex parte application, and then, immediately after it was signed and confirmed to be on the docket, distributed the application and Order to the public at about the same time it notified Apple.

That’s because it’s not a case about getting information from this one phone. There’s been a lot written about this already, but the phone isn’t even likely to have much information, since the terrorist destroyed his other phones but didn’t care enough about this one to destroy it.

Page 11 footnote 22c: Moreover, this is the only case in counsel’s memory in which an FBI Director has blogged in real-time about pending litigation, suggesting that the government does not believe the data on the phone will yield critical evidence about other suspects.

The blog post was a p.r. effort, lending even more credence to the argument that, for the FBI, this is all about setting a precedent.

Page 13 line 27 – page 14 line 6: Thus, quality assurance and security testing would require that the new operating system be tested on multiple devices and validated before being deployed. Apple would have to undertake additional testing efforts to confirm and validate that running this newly developed operating system to bypass the device’s security features will not inadvertently destroy or alter any user data. To the extent problems are identified (which is almost always the case), solutions would need to be developed and re-coded, and testing would begin anew.

An example of why anybody who says “It’s only one phone” either has absolutely no grasp of how software development works or else is just lying.

Page 14 lines 14 – 24: The All Writs Act (or the “Act”) does not provide the judiciary with the boundless and unbridled power the government asks this Court to exercise. The Act is intended to enable the federal courts to fill in gaps in the law so they can exercise the authority they already possess by virtue of the express powers granted to them by the Constitution and Congress; it does not grant the courts free-wheeling authority to change the substantive law, resolve policy disputes, or exercise new powers that Congress has not afforded them. Accordingly, the Ninth Circuit has squarely rejected the notion that “the district court has such wide-ranging inherent powers that it can impose a duty on a private party when Congress has failed to impose one. To so rule would be to usurp the legislative function and to improperly extend the limited federal court jurisdiction.” Plum Creek, 608 F.2d at 1290 (emphasis added).

This seems to be one of their best arguments, especially since it relies on precedent from a prior Ninth Circuit case instead of arguing that this would be bad policy.

Page 16 lines 1-5: Thus, in another pending case in which the government seeks to compel Apple to assist in obtaining information from a drug dealer’s iPhone, Magistrate Judge Orenstein issued an order stating that while the Act may be appropriately invoked “to fill in a statutory gap that Congress has failed to consider,” it cannot be used to grant the government authority “Congress chose not to confer.”

It’s important to note that Apple has already had success in arguing against the use of the All Writs Act to compel them to decrypt a phone. That was in New York, though, so it doesn’t have any precedential value in this case.

Page 17 lines 2-5: CALEA does not allow a law enforcement agency to require Apple to implement any specific design of its equipment, facilities, services or system configuration. Yet, that is precisely what the government seeks here. Thus, CALEA’s restrictions are directly on point.

I can’t get past this argument. My understanding is that the All Writs Act is for situations where there is no clear law. If there’s a law that law enforcement doesn’t like, they shouldn’t be able to use the All Writs Act to get around it.

Page 24 footnote 24: The government’s suggestion that Apple can destroy the software has clearly not been thought through, given that it would jeopardize criminal cases.

I love it when subtle insults are put into footnotes. I don’t know why I take so much pleasure from that, but I do.

Page 26 lines 12-16: Indeed, under the government’s formulation, any party whose assistance is deemed “necessary” by the government falls within the ambit of the All Writs Act and can be compelled to do anything the government needs to effectuate a lawful court order. While these sweeping powers might be nice to have from the government’s perspective, they simply are not authorized by law and would violate the Constitution.

I made the same point in my blog on the DOJ’s filing in this case, saying that, “It seems like the government’s reasoning would lead to the situation where anybody with any specialized skills would be required to assist in serving a warrant.”

Page 30 lines 1-4: Moreover, the government has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the back door it now seeks.

I don’t think the FBI wants to say publicly that the NSA can’t get into the phone. Of course, that assumes that the FBI’s purpose in this case is to gain access to the phone. I’ve said multiple times that its purpose is to set a precedent that companies need to break their encryption. I guess if you look at it that way, Apple is the only company that can serve that purpose in this case.

Problems with FBI’s Effort Against Apple

The Department of Justice filed a motion in response to Apple’s announcement that they would not be complying with the court order to create a backdoor that would allow access to the San Bernardino shooter’s cellphone. It is not totally surprising that they would do so, since the FBI is part of the DOJ. I’m a little surprised by the timing, though. I would have thought that it would have made more sense for the DOJ to file this motion after Apple made their official response to the court. But I am not a lawyer, so maybe there is some tactical reason why the government would want to file this first.

Page 1 lines 3-5: Rather than assist the effort to fully investigate a deadly terrorist attack by obeying this Court’s Order of February 16, 2016, Apple has responded by publicly repudiating that Order.

This is interesting, at least in light of the New York Times article claiming that Apple wanted to have the original order be sealed, and the government was the one who made it public, and only after that did Tim Cook draft and release his Letter to Customers.

Page 1 lines 9-11: Despite its efforts, Apple nonetheless retains the technical ability to comply with the Order, and so should be required to obey it.

This seems to be true. I haven’t seen anybody claiming that Apple doesn’t have the technical capabilities to remove the security on the phone. Of course, that being true doesn’t lead to the conclusion that they should have to use those technical capabilities in this case.

Page 2 lines 2-7: The Order requires Apple to assist the FBI with respect to this single iPhone used by Farook by providing the FBI with the opportunity to determine the passcode. The Order does not, as Apple’s public statement alleges, require Apple to create or provide a “back door” to every iPhone…

This is an example of the straw man logical fallacy (incidentally, the previous quote would also be an example of this fallacy). The argument by privacy advocates, Apple, and technologists isn’t that this particular operating system would lead to a backdoor for every iPhone. The argument is that the legal precedent set by this order would then lead to the government being able to compel similar access in the future.

This section continues that the order:

Page 2 lines 9-11: does not give the government “the power to reach into anyone’s device” without a warrant or court authorization;

This takes the straw man argument and makes it blatantly obvious. Notice how the motion stops after the word “device”? There’s a good reason for that. That’s because Apple doesn’t make the argument that they would be giving the government access without court authorization. That would make absolutely no sense because the whole argument is over a court order. The question is whether the court order should be legally enforceable, and whether enforcing the court order would be good policy for the United States.

Page 2 lines 16-19: In the past, Apple has consistently complied with a significant number of orders issued pursuant to the All Writs Act to facilitate the execution of search warrants on Apple devices running earlier versions of iOS.

Just because a citizen or corporation has voluntarily assisted law enforcement in the past, or at least not refused to assist law enforcement, does not mean that they are compelled to do the same in the future. Using this logic, if a suspect in a criminal investigation cooperates with the police, then he would be waiving his right to then refuse to cooperate in the future if he changes his mind. This is clearly not the way our justice system works.

Page 2 line 22 – page 8 line 2: Apple’s current refusal to comply with the Court’s Order, despite the technical feasibility of doing so, instead appears to be based on its concern for its business model and public brand marketing strategy.

Of course Apple is concerned about its business model and marketing. That doesn’t mean that is the only argument, or even the main argument, against compelling Apple to remove the security on an iPhone, and it certainly doesn’t prove that Apple is in the wrong here.

Page 3 lines 11-14: the urgency of this investigation requires this motion now that Apple has made its intention not to comply patently clear. This aspect of the investigation into the December 2, 2015 terrorist attack must move forward.

That answered my question of why the DOJ decided to make this motion now. I think it may be a little wishful thinking on the part of the government, though. I think most observers expect this to be litigated for quite a while, likely to the Supreme Court, since both sides have so much to gain and so much to lose if a ruling goes against them.

Page 12 lines 7-12: In Mountain Bell, the Ninth Circuit emphasized that its decision “should not be read to authorize the wholesale imposition upon private, third parties of duties pursuant to search warrants,” 616 F.2d at 1132, but Apple is not a random entity summoned off the street to offer assistance, nor is it the target of the investigation.

It seems to me like the government’s argument here would lead to almost anybody being compelled to assist with a warrant. Law enforcement wouldn’t be ordering just anybody to assist; they would want assistance from people who have particular skills, knowledge, or some other applicable quality law enforcement could make use of. The government’s reasoning would lead to a situation where anybody with any specialized skills would be required to assist in serving a warrant, which is a lot broader than the quoted Ninth Circuit opinion seems to imply.

Page 14 lines 3-10: assistance under the All Writs Act has been compelled to provide something that did not previously exist – the decryption of the contents of devices seized pursuant to a search warrant. In United States v. Fricosu, 841 F.Supp.2d 1232, 1237 (D. Co. 2012), a defendant’s computer – whose contents were encrypted – was seized, and the defendant was ordered pursuant to the All Writs Act to assist the government in producing a copy of the unencrypted contents of the computer.

This doesn’t seem like a very apt comparison. If Apple was being required to provide a password or some kind of knowledge to be used to decrypt the iPhone, then that would be similar to decrypting the defendant’s computer. In this case, Apple is being asked to write a new firmware update. That’s not the same thing as putting your password into a computer. And, it’s not exactly settled case law that a person can be compelled to give their password to law enforcement, anyway.

Page 14 line 27 – page 15 line 10: the Order is tailored for and limited to this particular phone…Nor is compliance with the Order a threat to other users of Apple products. Apple may maintain custody of the software, destroy it after its purpose under the Order has been served, refuse to disseminate it outside of Apple, and make clear to the world that it does not apply to other devices or users without lawful court orders. As such, compliance with the Order presents no danger for any other phone and is not “the equivalent of a master key, capable of opening hundreds of millions of locks.”

Again, the concern isn’t that this particular source code will be let out into the wild. The concern is that once the government sets this precedent, Apple will have to comply with similar orders in the future. The government didn’t pick this case to make their public stand because they need access to this phone in order to stop an impending attack. They chose this case because the publicity of a terrorist using an encrypted device works to further their effort to weaken encryption in a misguided attempt to battle the so-called “going dark problem.”

Page 20 lines 20-25: no one outside Apple would have access to the software required by the Order unless Apple itself chose to share it. This eliminates any danger that the software required by the Order would go into the “wrong hands” and lead to criminals’ and bad actors’ “potential to unlock any iPhone in someone’s physical possession.”

I’m glad the government has complete faith in Apple’s ability to keep something a secret. Do they have that same amount of faith in the Office of Personnel Management? Or in JP Morgan Chase? Or maybe Adobe, which had their source code stolen by hackers?

Page 20 line 26 – page 21 line 1: marketing or general policy concerns are not legally cognizable objections to the Order. As discussed above, the analysis of whether a court order presents an unreasonable burden is focused on the direct costs of compliance

I don’t know enough about the law to know if this is true, but it seems like it might be. It would be a lot harder to quantify potential indirect costs, such as that American tech companies may not be trusted in the global marketplace. Unfortunately, those indirect costs are worth a lot more than the direct costs to Apple of having some software engineers write some code.

Page 21 lines 9-10: Strong public policy interests favor enforcing the All Writs Act Order in this matter.

I’ll close with this, because it is obviously in debate. I think it’s obvious that I come down on the infosec community’s side, the technology community’s side, the side of privacy, and the side of security. In other words, I hope Apple wins this case. I don’t hope that because I support terrorists, as Apple has been accused of doing. I hope it mainly because the problem of precedent and the damage to our technology industry such a ruling would cause are humongous.

Logical Fallacies in NSA Director’s Interview

National Security Agency Director Adm. Michael Rogers gave an interview with Yahoo News’s Michael Isikoff. The interview is full of misleading statements and poor logic.

Rogers confirmed speculation that began right after the attack: that “some of the communications” of the Paris terrorists “were encrypted,” and, as a result, “we did not generate the insights ahead of time.”

According to this logic, if any terrorist uses any encryption, then law enforcement will not be able to find anything out about them. This is demonstrably false. The former director of the NSA, Gen. Michael Hayden, has said that, “We kill people based on metadata.” Metadata still exists even if communications are encrypted. Former NSA General Counsel Stewart Baker said that metadata can tell you, “everything about somebody’s life,” and that, “If you have enough metadata, you don’t really need content.”
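The metadata point is worth making concrete. Here is a toy sketch (everything in it is made up for illustration, and the “encryption” is a fake XOR stream, not real cryptography): even when the message body is unreadable, the envelope an observer sees still reveals who talked to whom and when.

```python
from hashlib import sha256

def seal_message(sender, recipient, body, key):
    """Toy illustration: the body is 'encrypted' (just XORed with a
    key-derived stream -- NOT real cryptography), but the envelope
    fields needed to route the message stay plaintext."""
    stream = sha256(key).digest() * (len(body) // 32 + 1)
    ciphertext = bytes(b ^ s for b, s in zip(body.encode(), stream))
    return {
        "from": sender,                       # metadata: visible to observers
        "to": recipient,                      # metadata: visible
        "timestamp": "2015-11-13T21:16:00Z",  # metadata: visible
        "body": ciphertext.hex(),             # content: needs the key
    }

msg = seal_message("alice", "bob", "meet at the usual place", b"secret")
# An eavesdropper learns who talked to whom, and when -- the metadata --
# without ever recovering the body.
print(msg["from"], msg["to"], msg["timestamp"])
```

That who/whom/when layer is exactly what Hayden’s “we kill people based on metadata” line refers to: it survives even universal strong encryption of message contents.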

Adm. Rogers goes on from that last point:

“Clearly, had we known, Paris would not have happened.”

This is an example of the logical fallacy of circular logic. Circular logic is when “the reasoner begins with what they are trying to end with.” Adm. Rogers assumes that if he had enough information then he would have been able to predict, and therefore stop, the attacks. According to his logic, there would never be a situation where law enforcement had information but failed to connect the dots and turn the information into actionable intelligence. Something like, say, an arrested al Qaeda operative being “described as interested in flight training for the purpose of using an airplane in a terrorist act” before 9/11 happened.

From a little later in the article:

Rogers has at times sought to steer a middle ground in this debate, acknowledging that encryption is “foundational to our future” and even saying recently that arguing about it “is a waste of time.”

Encryption is “foundational to our future” (I would add foundational to our present society) and arguing about it is a waste of time.


Adm. Rogers doesn’t stop there, though, and goes on to argue about encryption.

He frankly acknowledged, “I don’t know the answer” to unencrypting devices and applications without addressing the concerns over privacy and competitiveness, calling for a national collaboration among industry and government officials to solve the problem.

This is an example of the fallacy of argument to moderation. This fallacy says that in a choice between two extremes, the correct choice will fall somewhere in the middle. In this case, perfect encryption and total law enforcement access to all data would both be ruled out and the correct solution to the “going dark problem” would be somewhere in the middle. Once again, though, that ignores that it’s not an argument between two possible policy decisions. It’s an argument between having the most robust security you can or purposefully weakening security.


Rogers also provided new details about his agency’s efforts to implement the USA Freedom Act, a law passed in the wake of the Edward Snowden disclosures, which he said has made it “more expensive” for his agency to access the phone records of terror suspects inside the United States and has resulted in a “slightly slower” retrieval of data from U.S. phone companies.

That was kind of the point of the law.

But Rogers said the delay in retrieving phone records is measured “in hours, not days or weeks,” and he has not yet seen any “significant” problems that have “led to concerns … this is not going to work.”

Wow, just a few months ago, he was saying the exact opposite when he was lobbying against the bill, saying that “Americans will become less safe” when the Freedom Act goes into effect.

That kind of makes you wonder how much faith we should put in what Adm. Rogers says. As Ars Technica points out in their article on the interview, the Paris attacks give us yet another reason to question it.

ISIS has been known to use encrypted apps, such as Telegram, to communicate and recruit. But despite that encryption, the US did provide a warning of an impending ISIS attack in France over two months before the attack.

So, there was some forewarning that ISIS would attempt to attack France, despite the use of encryption by some terrorists. And, even more damaging to Adm. Rogers’ idea that “had we only known” we would have been able to stop the attacks is that the French did receive specific warnings about one of the attackers. Turkish police said they notified the French about him multiple times, but that, “We did, however, not hear back from France on the matter.”

NERC Committee Agenda Packages

The NERC Board of Trustees is meeting this week, and along with that are several standing committee meetings. While the meetings will not be simulcast online, the agendas for the meetings oftentimes include some interesting reading. A couple items from the Members Representative Committee and the Compliance Committee were interesting.

Members Representative Committee

On page 44 of the MRC agenda package, Compliance Guidance Implementation is discussed. They provide an update on the new process where compliance guidance will be vetted by the “ERO Enterprise,” and after that vetting and approval the guidance will be given “deference” from auditors in all the regions. This update says that the task force is beginning to review existing documents that can be submitted to the ERO Enterprise for endorsement. More interesting, though, is that the CCC members of the task force are developing a process to approve organizations to submit guidance documents even if they are not already on the pre-qualified list.

Another interesting bit of news is that a CMEP Practice Guide focused on what it means for auditors to “provide deference” is being developed. The guide on how to provide deference will be the first CMEP Practice Guide published. The CMEP Practice Guides are basically guidance created by the ERO Enterprise which provide direction to auditors on how they should conduct audits.

Compliance Committee

Lessons Learned Documents

This one had this great quote: “The CIP Version 5 Transition Advisory Group identified specific issues with the CIP Version 5 standard language, which were temporarily resolved through Lessons Learned documents.”

It then lists the issues that are being referred to the CIP V5 Revisions Standards Drafting Team:

  • Transmission Owner Control Centers
  • BES Cyber Assets/Programmable Electronic Devices
  • Virtualization
  • External Routable Connectivity

To call these issues even temporarily “resolved” is quite the stretch. Virtualization is not addressed at all in the Lessons Learned (LL) documents. While the others were addressed, they were not resolved. For example, the LL on BES Cyber Assets doesn’t provide a definition for “programmable,” which forms the basis of the Cyber Asset definition but doesn’t have any clear definition itself.

Note: A coworker told me that she’s heard the same line (“temporarily resolved through LL documents”) used by the V5 TAG several times. This is the first I’ve noticed it, though.

IRAs and ICEs

The 2015 ERO Enterprise Annual CMEP Report was included in the agenda package. It said that there were 236 entities scheduled for an audit in 2015. The Regions conducted a total of 230 Inherent Risk Assessments for entities on the audit schedule, so they got almost all of them. They also performed 31 Internal Controls Evaluations for entities on the audit schedule, or about 13% of the entities that had an IRA performed. It would have been nice to have a breakdown of those numbers by region. Are all eight regions represented in those 31 ICEs? Or are the numbers dominated by just one or two regions? That would be helpful information to have; it may be available from other sources, but I haven’t researched that question.

Outreach Events Focused on Risk-based CMEP

These figures were included in the CMEP report. It seems weird that ReliabilityFirst would have done almost twice as many events as anybody else, but had the second-lowest number of participants. These numbers would mean that RF only had about 9 participants per event, which seems quite low. It makes me wonder if the different regions didn’t all use a standardized definition of what constitutes an “outreach event.”