Don't Crack the iPhone: Apple Is Not the FBI's Tech Support
Apple is right to push back against government efforts to undermine iPhone security.
This week, the notoriously secretive Apple went up against the FBI when the agency requested that the company help break open an iPhone 5c used by the San Bernardino shooters. Instead, Tim Cook released a letter stating publicly that Apple believed creating a special tool to disable security features on iPhones would set a dangerous precedent. And he was right to do so.
Knock, Knock

To be clear: The FBI is not specifically asking that Apple provide an always-accessible backdoor into everyone's iPhone. What it wants Apple to do is construct a special version of iOS that would bypass the time delay iOS imposes between failed passcode attempts and disable the setting that wipes an iPhone after 10 failed attempts. That would let agents brute-force the phone's passcode, that is, try lots and lots of wrong passcodes until stumbling across the right one. (It's been suggested that creating a super-long passcode might slow that process down.)
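The math behind that suggestion is simple: with the retry delays and auto-wipe out of the way, the only remaining protection is the size of the passcode search space. Here is a minimal back-of-the-envelope sketch (Python), assuming roughly 80 ms of key-derivation time per guess; that per-guess figure is an illustrative assumption, not Apple's published number:

    # Worst-case brute-force time for different passcode shapes.
    # The 80 ms per-guess cost is an assumption for illustration only.
    SECONDS_PER_GUESS = 0.08

    def worst_case_seconds(alphabet_size, length):
        """Time to try every possible passcode of this alphabet and length."""
        return (alphabet_size ** length) * SECONDS_PER_GUESS

    for label, alphabet, length in [
        ("4-digit PIN", 10, 4),
        ("6-digit PIN", 10, 6),
        ("10-char alphanumeric", 62, 10),
    ]:
        days = worst_case_seconds(alphabet, length) / 86_400
        print(f"{label}: {days:,.2f} days, worst case")

Under those assumptions a 4-digit PIN falls in minutes and a 6-digit PIN in about a day, while a 10-character alphanumeric passcode stretches the worst case into hundreds of billions of days, which is why a long passcode is the practical defense against this kind of brute forcing.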
The argument that law enforcement and the intelligence community are responsible enough to handle powerful tools like these has been around for a long time. The FBI and others have talked about the risk of "going dark," where communications are carried out via encrypted services that are inaccessible to investigators or surveillance tools.
The old adage is that a backdoor for the good guys is a backdoor for the bad guys; the safest way to keep people out is not to give them a way in. It's the argument that developer and CEO Nico Sell made when FBI agents approached her about putting a backdoor in her secure messaging service, Wickr. But it's far from just a hypothetical argument.
Broken Locks

Take TSA-compliant luggage locks. When you buy one from the store, it's designed to accept one of several possible master keys in the hands of TSA agents. The idea is that this allows the right people—the TSA inspectors—to open your luggage without having to cut off locks and then safely lock the baggage again. Only you and the inspectors should be able to open it.
It's a nice idea, but it only works as long as access to the master keys is restricted to the right people. The good guys. But those keys were posted online and made into 3D-printable models, giving access to everyone: good guys and bad guys.
It's this scenario that Apple cites as its primary reason for fighting the FBI's court order. If Apple created a special version of iOS and used it to unlock the phone in question, the tool might not stay under the company's control for very long. If it got loose, it could undermine the hard work Apple has put into developing a smart, secure phone. And if it exists at all, Apple could be compelled to use it again, and again, and again.
To be fair, Apple already spends a good deal of time and effort responding to court orders and investigators' requests. The New York Times reports that the company handed over the shooter's iPhone backup files stored on iCloud. When PCMag recently took a closer look at encryption and how Apple stores our information, we found that whatever sits on Apple's servers is potentially readable by the company. But Apple is making it clear that it is only willing to go so far, and developing custom intrusion tools for the FBI is apparently the limit.
Security for All

As our devices become more and more personal, it is no surprise that they'll be targeted by law enforcement and intelligence agencies. But that's no excuse for the FBI, or anyone else, to weaken existing security tools and claim that digital privacy is the exclusive realm of those in power. Which is, effectively, what the FBI is doing.
Back in 2014, FBI director James B. Comey addressed the crowd at the RSA Conference. When it came to surveillance and searches, particularly of the digital kind, he said, "Our goal is to be surgical and precise in what we're looking for, and do whatever we can to protect privacy rights and competitive advantage." Building a magic key for iPhones, or preventing the widespread use of encryption, would do neither.
If the FBI wants to get into an iPhone, or any other secure device, it can develop the technology itself. Security experts often tell me that if someone wants to break into a phone and has physical access to it, they will eventually succeed. I'm confident that if the FBI rolled up its sleeves, it would get what it's looking for. If accused murderer and bath salts enthusiast John McAfee thinks he can pull off cracking an iPhone, surely the FBI can, too.
Max Eddy is a Software Analyst, taking a critical eye to Android apps and security services. He's also PC Mag's foremost authority on weather stations and digital scrapbooking software. When not polishing his tinfoil hat or plumbing the depths of the Dark Web, he can be found working to discern the 100 Best Android Apps. Prior to PCMag, Max wrote for the International Digital Times, The International Science Times, and The Mary Sue. He has also been known to write for Geek.com.