Wednesday, March 30, 2016

FBI vs Apple: A Postmortem

By now you’ve doubtless heard that the FBI has broken into the iPhone of Syed Farook, one of the San Bernardino shooters who killed fourteen people. Consequently, they won’t be requiring Apple’s (compelled) services any more.

A number of people have written in and asked what we knew about the hack, and the frank answer is “not a heck of a lot”. And it’s not just us, because the FBI has classified the technique. What we do know is that they paid Cellebrite, an Israeli security firm, at least $218,004.85 to get the job done for them. Why would we want to know more? Because, broadly, it matters a lot whether it was a hardware attack or a software attack.

Software or Hardware?

If the attack was hardware, it may not be such a big deal. The iPhone supposedly prevents a brute-force (guessing) attack against the passcode by wiping memory or imposing delays after a fixed number of wrong guesses. The basic idea behind a possible hardware attack is to dump the memory from a NAND flash chip on board, try a few passcodes, and then re-flash the memory to its initial state before tripping the security. Another possibility, if there’s a timeout on passcode guesses, is to associate the phone with a fake cell tower and push a new time to the phone after every lockout. Delays are meaningless if you can arbitrarily set the time on the phone.
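The NAND-mirroring idea can be sketched as a toy simulation. Everything here — the `SecureStorage` class, `try_pin`, the wipe limit — is hypothetical and only models the logic; real iPhone hardware exposes no such API:

```python
# Toy model of NAND mirroring: snapshot the "flash" before the wipe limit
# trips, then restore it to reset the retry counter and keep guessing.
import copy

class SecureStorage:
    """Stands in for the phone's flash: holds the PIN and a retry counter."""
    WIPE_LIMIT = 10  # wrong guesses allowed before the data is erased

    def __init__(self, pin):
        self.state = {"pin": pin, "failed": 0, "wiped": False}

    def try_pin(self, guess):
        if self.state["wiped"]:
            return False
        if guess == self.state["pin"]:
            return True
        self.state["failed"] += 1
        if self.state["failed"] >= self.WIPE_LIMIT:
            self.state["wiped"] = True  # auto-erase tripped
        return False

def brute_force_with_mirroring(storage):
    """Dump the flash image, guess PINs, re-flash before the wipe trips."""
    snapshot = copy.deepcopy(storage.state)          # "dump the NAND"
    for pin in range(10000):                         # 4-digit PIN space
        if storage.try_pin(f"{pin:04d}"):
            return f"{pin:04d}"
        # Restore the original image just before the auto-erase would fire.
        if storage.state["failed"] >= SecureStorage.WIPE_LIMIT - 1:
            storage.state = copy.deepcopy(snapshot)  # "re-flash"
    return None

print(brute_force_with_mirroring(SecureStorage("7391")))  # prints 7391
```

Without the re-flash step, the same loop would trip the auto-erase after ten wrong guesses; the snapshot-and-restore is what turns a ten-guess budget into an unlimited one.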

The hardware attacks, if that’s what was used, aren’t a big deal because they require physical control of the phone, potentially for a long time. This isn’t something that a criminal gang is going to use to steal your bank account data, but something that governments can do in limited situations, legally, and with warrants. In contrast, an unknown flaw in the OS’s security model could be remotely exploitable, and would likely work on any phone in a lot shorter time. If the flaw became known to criminal gangs before Apple, millions of Americans with iPhones would be at risk.

Responsible Disclosure?

If the FBI is sitting on an OS flaw, and it is one that’s in principle exploitable by criminals, they owe it to their constituency — US citizens — to disclose that information to Apple so that it can get fixed. But because the FBI has classified the hack, they’re not going to be compelled to tell anyone how they did it.

It’s certainly the case that if we had hacked this phone, we’d be subject to charges under the DMCA or worse. And we’d certainly be under a moral, if not legal, obligation to inform Apple so that they could fix things. We hope that this means that the crack was hardware based. It’s worth mentioning that what the FBI was demanding from Apple was a software attack — this may be further evidence that they don’t have one.

So the Farook case is over, which means we can all rest assured that our phones are safe, right? (Or at least they’re safe from anyone who hasn’t hired Cellebrite.) After all, the FBI director publicly stated that this was just about unlocking only a single (terrorist’s) phone, and not about setting a precedent, so they’ll stop trying to force firms to break their own encryption, right?

We don’t believe that for a second. The Farook case was intended to capitalize on the public’s fear of terrorism to force Apple to play along and take actions that harm all of their customers. The FBI will be trying to establish precedent to compel decryption again, and will try until they find a judge to agree with them.

Sound like a conspiracy theory? Don’t take it from some crackpot writer for a niche tech website. Richard Clarke, former national security advisor and head of counterterrorism, weighed in on the subject:

“[The FBI] is not as interested in solving the problem as they are in getting a legal precedent,” Clarke said. “Every expert I know believes the NSA could crack this phone. They want the precedent that government could compel a device manufacturer to let the government in.”

“The FBI director is exaggerating the need for this, trying to build it up as an emotional case … It’s Jim Comey. And the Attorney General is letting him get away with it.”

What Clarke said is consistent with our crackpot conspiracy theories. The FBI has been systematically trying to compel firms to backdoor their own encryption. If they were interested in just this one phone, they could have simply paid an Israeli security firm $200,000 to get the job done from the start. (We have no inside information about if or why the NSA wouldn’t play along.)

The FBI has been after Apple since the company announced that it was expanding encryption coverage. Read this headline from December 2014. Does that sound familiar? It’s exactly the same legal argument they used in the Farook case. Only that time, the FBI got shut down instead of hiring an outside hacking firm. That didn’t stop the FBI from telling Apple employees that they would be killing children by enabling encryption on their phones.

You don’t need to look very far into the future to find the FBI’s next test case, either. Indeed, there are at least a dozen open cases at the moment, all justified under the All Writs Act. It’s hard to believe Director Comey’s argument that Farook was about a single phone.

(As we were writing this article, the Justice Department essentially declared victory in Farook, and now seems to say that it will use the Farook result as precedent. That was fast!)

Which Side Are They On?

There is a real problem at both the NSA and the FBI at the moment. They’re tasked with getting information on potential terrorists and prosecuting crimes, while at the same time protecting American citizens’ data and property. In particular, the NSA helps develop civilian cryptography, and the FBI is responsible for interstate Internet fraud. In cases like this, the same agencies have both an interest in the public’s benefit from strong encryption and a desire to decrypt individuals’ phones as evidence. They’re required to be schizophrenic. One can only hope that they’re balancing the conflicting demands appropriately.

If the Farook case has shown us anything, it’s that the FBI is behaving as if they value their offensive mandate more heavily than their defensive one — even though it weakens the security of US citizens with legitimate interests in keeping their confidential information safe.

The FBI testified that only Apple could unlock the phone while simultaneously seeking an outside firm to unlock it. Indeed, it was cracked just over a month after this testimony. They picked an emotionally charged case and touted it heavily in the public press, something that they don’t do with their other cases — most notably those where the judges decide against their interpretation of the All Writs Act. They asked for a software-based attack, which is something with far-reaching consequences (and dangers if it falls into the wrong hands). And finally, they’ve relied on misleading and hyperbolic testimony to push the issue. In short, they’re playing dirty pool and stretching the truth, which is what one expects of the prosecution.

This would be uncontroversial if they weren’t also tasked with protecting the interests of American citizens.


Filed under: iphone hacks, news, security hacks

from iphone hacks – Hackaday

Monday, March 7, 2016

Bullet-time Video Effect by Throwing Your Phone Around

Ski areas are setting formal policies for drones left and right, but what happens when your drone isn’t a drone but is instead a tethered iPhone with wings swinging around you like a ball-and-chain flail as you careen down a mountain? [nicvuignier] decided to explore the possibility of capturing bullet-time video of his ski runs by essentially swinging his phone around him on a tether. The phone is attached to a winged carrier of his own design, 3D printed in PLA.

One would think this would end in all kinds of disaster, but we haven’t seen the outtakes yet, and the making-of video has an interesting perspective on each of the challenges he encountered in perfecting the carrier, ranging from keeping it stable and upright, to reducing the motion sickness induced by the spinning perspective, to keeping it durable enough to withstand the harsh environment and protect the phone.

He has open sourced the design, which works for either iPhone or GoPro models, or it is available for preorder if you are worried about catastrophic delamination of your 3D printed model resulting in much more bullet-like projectile motion.

Thank you [Remeton] for pointing us to this nausea-inducing (ish) hack.


Filed under: iphone hacks, video hacks
