Make no mistake – the suddenly white-hot debate over whether Apple will create a means for the FBI to “unlock” one of its cell phones is a defining moment in the rollout of the 21st century’s mobile, connected world.
This Silicon Valley-Washington D.C. face-off raises issues of privacy and national security, of freedom of speech, and even foreign policy considerations with respect to repressive regimes and those governments hoping to track journalists’ sources.
And lest we forget, Apple’s stance flows from a long-held, business-based decision to protect its brand with customers who prize the data protection built into iPhones. In a New York legal dispute with prosecutors last year, The Daily Beast reported Wednesday, the company said, “forcing Apple to extract data… absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand.”
On Monday, a federal magistrate – in what is said to be the first such order of its kind – told Apple to create a new technological method that would allow government officials to override login safeguards built into its latest phones. One such method is to reconfigure a phone to eliminate the feature under which a small number of unsuccessful passcode attempts – 10, in the iPhone’s case – results in the data being erased. With that safeguard removed, so-called “brute force” efforts can find the right passcode by using computers to run through every possible combination.
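To put the scale in perspective, here is a minimal, purely hypothetical sketch – not Apple’s software, and using a made-up try_passcode check – of why the retry-and-erase limit is the real safeguard: a four-digit numeric passcode has only 10,000 possible combinations (a six-digit one has 1,000,000), and a machine freed from any lockout can simply try them all.

```python
# Hypothetical illustration only: why removing the erase-after-10-failures limit matters.
# "try_passcode" stands in for whatever check a device performs; it is not a real API.
from itertools import product

def brute_force(try_passcode, digits=4):
    """Enumerate every numeric passcode of the given length until one succeeds."""
    for combo in product("0123456789", repeat=digits):
        guess = "".join(combo)
        if try_passcode(guess):   # with no retry limit, nothing stops this loop
            return guess
    return None

# Toy demonstration against a pretend device whose passcode is "7294":
secret = "7294"
print(brute_force(lambda guess: guess == secret))  # at most 10,000 guesses for 4 digits
```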
FBI director James Comey told Congress last week the bureau has been locked out of – and presumably stopped short of the maximum number of attempts to unlock – one of the shooter’s phones, in attempting to gather evidence about the December mass shooting in San Bernardino, Calif. The federal magistrate’s demand is based on the 1789 “All Writs Act,” which permits courts to issue orders in matters not covered by specific statutes – and Apple likely will appeal the order on the grounds that it goes beyond current law.
In a “letter to customers” posted late Tuesday, Apple CEO Tim Cook wrote that “the United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. … the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
Calling the request “chilling,” Cook said, “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.”
The Apple CEO defended his company, saying it has “no sympathy for terrorists,” has turned over data when asked and has made Apple engineers available to offer “our best ideas on a number of investigative options at their disposal.”
Cook acknowledged the government considers the new version of the operating system to be for “one-time use,” but said smartphones have become the repositories of “an incredible amount of personal information … our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.”
Once there is a “key” to gain access to such data, Cook said, “the technique could be used over and over again, on any number of devices… the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”
Critics of giving government such methods to decipher communication – or of even creating such keys for company use if compelled by a court order – say it will also pave the way for skilled terrorists to undermine Web security, potentially allow repressive regimes to track down dissidents, and thwart press attempts to uncover corruption and human rights violations. Ironically, the White House criticized a similar Chinese government initiative regarding encryption override several months ago as antithetical to democratic reforms.
Cook’s letter Tuesday said that “while we believe the FBI’s intentions are good … ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”
In the end, a common view is that Apple and other tech firms fear that a single government request to override access protection – whether it’s called a “backdoor” or just a one-time tool – will lead to multiple such “one-time” requests, and eventually to a flood of such demands from governments that foment terror rather than fight it.
Google CEO Sundar Pichai – whose company makes the Android operating system, with encryption features similar to those in Apple’s iPhone system – said Wednesday that “we give law enforcement access to data based on valid legal order. But that’s wholly different than requiring companies to enable hacking of customer devices and data.”
The government’s case is not without support, particularly from those who take a far less apocalyptic view of the order to Apple – and who downplay the wider impact of creating a one-time means to open a specific device. Comparisons are made to long-established legal precedents involving wiretaps, and to the limiting idea that Apple effectively can meet the order by extracting the data for the FBI without handing over the software used to unlock the phone. One NBC cybersecurity expert said the circumstance is no different from court orders to Facebook or e-mail services seeking specific information about a specific accountholder.
Jack Smith, a contributor to mic.com, an online news site targeted to millennials, disputed the idea that meeting the FBI’s request would open any floodgates: “The truth is that there is a protection in place: a warrant. We should fight to make warrants difficult to obtain. But the real unprecedented feat is the idea that a corporation like Apple should be able to prevent our law enforcement from carrying out a lawfully obtained warrant.”
On Medium, an online blogging platform aimed at the new tech world, writer Blair Reeves said the public should “… bear in mind: at no period in American history has there ever been any personal information, let alone any whole class of information, that was ever considered wholly immune to government access. The government has been wiretapping for a century. The FBI accessed bank records to catch mobsters in the ’30s. Location tracking — the old-fashioned way, in person — is as old as government itself.”
The legal thicket surrounding the Apple-government faceoff is rooted in laws on the ever-evolving concept of privacy, first outlined in the late 19th century; in Fourth Amendment protections against “unreasonable search and seizure;” in national security actions and laws that have changed direction over the years – most recently to accommodate threats from foreign terrorists; and in the debate over national security and surveillance following leaks by Chelsea Manning and, later, former NSA contractor Edward Snowden.
In 1928, in Olmstead v. United States, the Supreme Court said it was legal for federal officers to wiretap suspected bootleggers without a court order because tapping into the phone line did not involve an actual, physical intrusion into a home or business.
But in the late 1960s, in Berger v. New York and more prominently in Katz v. United States, the Court reversed its view about such “premises” requirements, and the legal precepts grew to include a broader “reasonable expectation of privacy” over such things as phone lines that reach outside the “home.” And in 2012, in United States v. Jones, the Court ruled that police use of a planted GPS device to track the movements of a suspected drug dealer was an impermissible “search” conducted without a warrant.
In our new tech world of global communication and data-sharing, it’s not the content of phone calls that’s of so much interest as the “metadata” that can be gleaned from phones and stored transactional information about users, devices and activities.
This renewal of a national debate over privacy, security and information will be an important milestone in the evolution of the digital world. Apple’s argument about potential government misuse or criminal appropriation, and the government’s counter that the tradeoff with privacy in certain cases is needed to fight terrorists, will help decide how we balance safety and security in the future against a suspicion about government intrusion into our lives.
In a more pragmatic sense, the spat is also another historical marker in the changes wrought by new technology, much as the initial reaction by major news organizations to the 2007 mass shooting on the Virginia Tech campus marked the first time significant portions of news staffs were devoted to soliciting and using citizen e-mails and mobile phone videos rather than gathering news themselves, first-hand.
One 19th century definition of privacy called it the “right to be let alone.” The 21st century question arising from this Apple technological challenge is whether we add “…except when the government gets to go through your phone data” to that definition.
About the author: Gene Policinski is chief operating officer of the Newseum Institute and senior vice president of the Institute’s First Amendment Center. Email: gpolicinski[at]newseum.org. Twitter: @genefac.
This article was published by the Newseum Institute.