Lots of people have some really strong opinions about the Apple debate. I'd like to suggest that like all arguments involving technology, if you don't understand the technology at a rudimentary level please stay out of the debate. Or better yet, educate yourself so you truly do understand what's going on and can speak intelligently. What this debate doesn't need is another talking head on Fox News or CNN who doesn't know the first thing about the technology involved.
What the FBI is essentially asking for is a manufacturer produced backdoor for the iPhone. I'll cover the technology issues with this specific request in a later post if I really think it is warranted. In this post, I really want to examine the precedent that the request sets for the rest of the computing industry.
What really happens when you patch your computer?
When you patch your computer, an update process on your machine reaches out over the Internet to download an update. Ideally, this update is downloaded over HTTPS. With the HTTPS download, best practice dictates that the update process use certificate pinning to ensure that it is talking to the correct remote server. Without certificate pinning (or certificate validation), an attacker could use DNS spoofing and/or forged certificates to redirect the update process to an attacker-controlled site.
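As a rough illustration, here is a minimal certificate-pinning check in Python. The host name and pinned fingerprint below are hypothetical placeholders, not a real update service; the point is simply that the updater compares the SHA-256 fingerprint of the server's leaf certificate against a value baked into the client before it downloads anything.

```python
import hashlib
import socket
import ssl

# Hypothetical values for illustration only; this is not a real update service.
UPDATE_HOST = "updates.example.com"
PINNED_SHA256 = "0f1e2d..."  # expected SHA-256 fingerprint of the server's leaf cert

def get_server_cert_fingerprint(host: str, port: int = 443) -> str:
    """Connect over TLS and return the SHA-256 fingerprint of the leaf certificate."""
    context = ssl.create_default_context()  # still does normal chain/hostname validation
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der_cert).hexdigest()

if get_server_cert_fingerprint(UPDATE_HOST) != PINNED_SHA256:
    raise SystemExit("Certificate pin mismatch -- refusing to download the update")
# Only now would the updater proceed with the HTTPS download.
```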
When the update is downloaded, it is subsequently executed on the machine, almost always with elevated privileges (SYSTEM or root level). Ideally, this update package has been digitally signed with a key that is already known to the software applying the update. If the updater accepts anything signed with any "validated" certificate, an attacker could compromise a CA and issue themselves a "valid" signing certificate.
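Here is a sketch of what that signature check might look like, assuming the widely used Python cryptography package and RSA/SHA-256 signatures. The file names are hypothetical; the key point is that the vendor's public key ships with the updater rather than being fetched alongside the update.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

# Hypothetical file names; the vendor's public key ships with the updater itself,
# not alongside the downloaded package.
VENDOR_KEY_PATH = "vendor_update_signing_key.pem"
PACKAGE_PATH = "update_package.bin"
SIGNATURE_PATH = "update_package.sig"

def verify_update_package() -> bytes:
    """Verify the downloaded package against the vendor key baked into the updater."""
    with open(VENDOR_KEY_PATH, "rb") as f:
        vendor_key = serialization.load_pem_public_key(f.read())
    with open(PACKAGE_PATH, "rb") as f:
        package = f.read()
    with open(SIGNATURE_PATH, "rb") as f:
        signature = f.read()
    try:
        vendor_key.verify(signature, package, padding.PKCS1v15(), hashes.SHA256())
    except InvalidSignature:
        raise SystemExit("Signature check failed -- refusing to apply the update")
    return package  # only now hand the verified package to the privileged installer

if __name__ == "__main__":
    verify_update_package()
```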
Attack vectors against updates
The most obvious attack is to find a portion of the update process that is not protected by proper security controls. One example would be to compromise a software distribution site and replace the original package with trojaned software. For insecure software upgrade processes, the ISR Evilgrade tool already serves as a useful proof of concept for the damage that can be done.
A more technologically advanced attacker could find a flaw in the way that updates are applied. We saw this done with the Flame malware, which hijacked Windows updates.
Attackers could also steal valid signing certificates, as we saw with Stuxnet. In Stuxnet, the attackers signed device drivers, but the same concept could be applied to software updates. Likewise, we see some upgrades that aren't signed at all. Engineers usually claim these are still safe because the updates are distributed via HTTPS. This is a fallacy: attackers can much more easily steal an HTTPS certificate than a code signing certificate (especially the newer EV signing certs).
How does all of this apply to Apple/FBI?
If Apple builds a firmware backdoor for the iPhone and delivers it to the FBI, it will set a dangerous precedent. And the precedent goes far beyond mobile phones. Once the FBI learns that it can use the courts to force vendors to backdoor their products, it will use that capability to target the update process.
Picture this: the FBI is worried about a terror suspect. Through surveillance, they determine that the suspect uses a Windows laptop, but they can't get access to it. In fact, they are having trouble identifying any specific selector for the laptop itself. For anonymity reasons, the terrorist only connects this laptop to the Internet at his local neighborhood coffee shop. So there's that.
In the name of national security, the FBI goes to Microsoft with a court order. They need MS to engineer a backdoor that will allow them to read data on the terror suspect's computer. The FBI says that MS should build a malicious Windows update that installs a remote access trojan (RAT) on the suspect's machine. Because they don't know which machine belongs to the suspect, MS should push this update to every machine that connects to the update service from the public IP address of the coffee shop.
This same game could play out with Ubuntu, Red Hat, OS X, you name it: pretty much any operating system. If the development of that OS happens in the US (or a country that will honor a US court order), then the FBI can compel the vendor to create malicious updates.
These update concerns don't just apply to operating systems; they apply to any software (including third-party software) running on the suspect's machine.
I'm not a criminal, I don't care
First off, are you sure? If the answer is yes, there's a great Twitter account you should follow that tweets out obscure federal crimes. Stuff you wouldn't believe. After reading this account, I figure that we are all federal felons for some obscure reason we are probably unaware of.
In the scenario I provided earlier, the FBI would compel Microsoft to install the update on any laptop calling out from the coffee shop, since they can't find a selector for the specific user. This is a serious issue: the FBI will now have backdoors on many machines rather than just one. They might delete these extra RATs, or they might look for evidence of other crimes, you know, like law enforcement does. Of course, the scope of the original warrant might then be an issue, but there's always parallel construction to the rescue.
Parting thoughts
Bottom line, this is a bad idea. It sets a horrible precedent and changes the customer/vendor trust model forever. Most of our enterprise trust models assume that vendors won't be backdoored, or at a very minimum that they will never exploit our machines directly. If the vendor can exploit our machines at will to give a third party access (court order aside), do you really own the machine? Do you really own the data?
If the FBI succeeds in this case, vendors are likely to flee the US for countries with less oppressive courts. Another possibility is an M-of-N signing strategy, in which signing an update would require the cooperation of a majority of parties, each holding a portion of the keying material. A court in a single country could no longer compel a vendor to sign an update on its own. Sure, this might be effective, but it will be costly, and those costs will be passed on to you, the consumer.
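As a simplified sketch of the idea: a true threshold scheme splits a single key's material, but the effect can be approximated with multiple independent signer keys, where the updater requires at least M valid signatures from N trusted keys before accepting a package. The key names and counts below are hypothetical.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

REQUIRED_SIGNERS = 3          # M: independent approvals needed to accept an update
TRUSTED_KEY_FILES = [         # N: hypothetical signer keys held in different jurisdictions
    "signer_us.pem", "signer_eu.pem", "signer_ch.pem", "signer_jp.pem", "signer_br.pem",
]

def count_valid_signatures(package: bytes, signatures: dict) -> int:
    """Count how many trusted signers produced a valid signature over the package."""
    valid = 0
    for key_file in TRUSTED_KEY_FILES:
        sig = signatures.get(key_file)
        if sig is None:
            continue
        with open(key_file, "rb") as f:
            signer_key = serialization.load_pem_public_key(f.read())
        try:
            signer_key.verify(sig, package, padding.PKCS1v15(), hashes.SHA256())
            valid += 1
        except InvalidSignature:
            pass  # a bad or coerced signature simply doesn't count toward the quorum
    return valid

def accept_update(package: bytes, signatures: dict) -> bool:
    # No single signer (or the court compelling that signer) can push an update alone.
    return count_valid_signatures(package, signatures) >= REQUIRED_SIGNERS
```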
There are other issues with these malicious updates falling into the wrong hands, targeting the wrong machines, etc. If there's enough interest (hit me up in comments, Twitter, etc.) I might write another post that addresses these concerns. Let me know if there's any interest in a follow up.
Thanks for writing! Very interested to read your further thoughts about this tech falling into the wrong hands: I'm seeing this issue being framed as whether or not we trust the government, which it seems is only part of the greater concern.
While I understand this order is very specific, it is that way because of the circumstances of physical access to the phone. If the FBI didn't have the phone, the order wouldn't be so narrow. Either way, it will set the precedent for the courts forcing companies to weaken their security to gain intelligence. Given that precedent, the MS scenario I play out is a logical step.
Are you really not worried about the precedent this will set if Apple is forced to comply?
The iPhone in question was a company-issued device. I can't say who is to blame (the admin or the executives) for not installing the necessary software to make this a moot point. Given their incompetence, who, then, should act responsibly?
Even if I reject that argument, is there any doubt that complying with the court order will materially hurt Apple's business globally?
Rejecting the argument is fair; I don't expect you to agree with my view, and I respect that you have yours.
I don't see it hurting Apple's business globally:
X percent care about Apple's potential invasion of their privacy.
Y percent don't.
Of the X percent, a subset would be using the 5c, and if they really cared, they would upgrade to the more secure versions of the phone (thus adding money to Apple's bottom line).
For future versions of the phone, I'm certain Apple will modify the phone/OS so that they will not be able to comply with this kind of order, as the user's passcode would be paramount.
LEO has been able to break into Apple devices for a while (I'm sure you've seen the latest SANS post), and all the while Apple has posted record profits. This includes Apple unlocking phones directly; it hasn't affected their bottom line at all.
Those who would leave because of this would come right back once Apple has updated its security so that it is no longer a potential adversary.
How can Apple apply the update to a locked iPhone without user approval?
Are there special security updates which can be applied silently?
If so, that is a backdoor in itself.
"if you don't understand the technology at a rudimentary level please stay out of the debate. Or better yet, educate yourself so you truly do understand what's going on and can speak intelligently."
Apple had better hope their attorney, Ted Olson, heeds this advice. His argument on "CBS This Morning" was unintelligible.