Apple lawyer Ted Olson says the tech heavyweight has good reason not to help federal investigators hack an iPhone that belonged to one of the San Bernardino shooters. The company, he says, “has to draw the line at re-creating code” and “changing” its product.
To which there is a two-word response: seat belts.
And here are two more: air bags.
In both those cases, the auto industry said federal officials had no right to make them tinker with their products. And in both cases, the feds prevailed, arguing that public safety is a more important consideration than automakers’ independence.
There are many other examples of government officials telling businesses to fiddle with their products in the name of safeguarding the public – baby cribs, toys, gasoline, mattresses, prescription drugs, aircraft and so on.
“Apple is arguing that the cellphone is a private space and that the user’s privacy would be infringed,” said Timothy Lytton, a law professor at Georgia State University who specializes in regulation of consumer products. “The government is saying, ‘No, it’s something you bring out into the world.’ ”
That’s the crux of the dispute: Is an iPhone or any other mobile device representative of your most intimate behavior, and thus protected by privacy laws, or is it a ubiquitous consumer product subject to the same oversight as other goods found in everyday life?
Apple says what the government is asking is overreach – a tricky thing to define.
The government can’t seek access to a terrorist’s iPhone, but it can interfere with your personal freedom and make you wear a helmet when you ride a motorcycle? It can’t require Apple to write some new code, but it can force cigarette companies to print on packages that their product may kill you?
Privacy is a big deal. And if that were the sole issue here, I don’t think anyone would say the government should have free rein to root around in your gadgets.
“The government is asking for a modification of a product that implicates an important right,” Lytton said. “Your phone is a private sphere of substance, just like your bedroom.”
However, the government isn’t talking about getting into everyone’s cellphones or everyone’s bedroom. It wants access to a single phone.
“The question is whether that one phone represents a substantial hazard to the public,” said Bill Kitzes, a product-safety expert and former program manager with the Consumer Product Safety Commission. “You could argue that it does.”
Privacy becomes an issue, he says, only if you don’t trust the government or Apple to keep the potential backdoor created for this single iPhone under wraps.
“If that software got out and anybody who had it could get into anyone’s phone, that could be a real problem.”
The government says in a court filing that its demand for Apple to help unlock the shooter’s phone “does not give the government ‘the power to reach into anyone’s device’ without a warrant or court authorization,” nor does it “compromise the security of personal information.”
It says the software the FBI wants Apple to write would remain under the company’s control.
“No one outside Apple would have access to the software required by the order unless Apple itself chose to share it,” the government says.
Can we trust Apple? Can we trust the government? I suspect many people would say no and no.
Distrust of the government is nothing new, and is arguably well deserved (yes, looking at you, NSA).
As for Apple, it’s worth noting that the company has a long track record of ratting out customers to the feds. In the first half of last year, according to the news site Quartz, Apple received 971 government requests for user data. It complied with a hefty 81 percent of those requests.
I don’t buy that Apple has seen the light on users’ privacy. I think the company is more concerned that if it caves here, what will happen when any other government – China’s, say – also demands security-related product changes?
That’s a reasonable worry. And it’s the company’s most convincing argument in favor of why the feds should back off.
Apple is on less sure ground when it argues, as Olson did, that the government can’t make it “re-create code” or “change the iPhone.”
It can, when public safety is a factor. Again, seat belts.
We need to protect what little privacy we have left. But in this case – and, from a broader privacy perspective, this case alone – I think the government is right.
Hack the damn phone, Apple.