Voluntary data sharing between private and government entities has existed as long as tech companies have. From its very early days, Google forged close ties with the government, as author Yasha Levine explains in his book, Surveillance Valley: The Secret Military History of the Internet. Levine describes Google's realization that, in its role as a government contractor, “the same platforms and services that [it] deploy[ed] to monitor people’s lives and grab their data could be put to use running huge swaths of the U.S. government, including the military, spy agencies, police departments and schools.” As a token of how indispensable Google had become to the United States government, the tech company entered into a secret deal with the NSA in 2010. Google “agreed to provide information about traffic on its networks in exchange for intelligence from the NSA about what it knew of foreign hackers,” according to defense reporter Shane Harris, author of @War: The Rise of the Military-Internet Complex. Levine contextualizes this pact in his book: “This made perfect sense. Google servers supplied critical services to the Pentagon, the CIA and the state department, just to name a few. It was part of the military family and essential to American society. It needed to be protected, too.”
As we have documented at The Privacy Issue, pacts with U.S. intelligence are a mix of cooperation with, and infiltration by, three-letter agencies. Though nearly every major Big Tech company has cooperated with the NSA's surveillance programs in some way, the Google cloud was notably backdoored via clandestine means that were likely unknown to the company, with NSA agents gleefully bragging about the infiltration in an internal presentation.
Informal Data Collection
Companies continue to freely share their data repositories with government officials and, though there is occasional pushback, this data sharing has become common practice. In 2019, Buzzfeed News reported that the leading DNA testing company, FamilyTreeDNA, gave the FBI access to almost two million profiles for the purpose of identifying victims and suspects in unresolved crimes – without user knowledge. This marked the first time that a commercial genetic testing company had shared data with law enforcement at its own discretion. The backlash from users and privacy advocates was significant. As genealogist and author Debbie Kennett told Buzzfeed News, “The real risk is not exposure of info[rmation] but that an innocent person could be swept up in a criminal investigation because his or her cousin has taken a DNA test.” FamilyTreeDNA – which had previously positioned itself as highly protective of user data – has since acknowledged its actions and apologized, though it has not revoked the U.S. government’s access.
Law enforcement has become increasingly determined to get their hands on information from apps and service providers, and will often compel anyone with access to that information to hand it over, including family members. In a 2020 case, Vermont police convinced a child to hand over data on his father via the Life360 tracking app, even upgrading the app to give police richer information on the father's location history. Even without direct access to Life360's data centers, this shows the capability of police and government officials to coerce individuals during the course of an investigation.
Organizations are perhaps more likely to bend to pressure, especially when it can be applied without the approval of the secret U.S. Foreign Intelligence Surveillance Court (FISC). In 2017, the U.S. government made it clear that it did not require FISC to rubber-stamp its actions:
The Attorney General and Director of National Intelligence may direct, in the form of a written directive, an electronic communication service provider to "provide the Government with all information, facilities, or assistance necessary to accomplish the acquisition."... To the extent that a provider does not fully provide such information, facilities, or assistance, FISA provides a means for the government to require the provider's compliance.
Stating that the Foreign Intelligence Surveillance Act (FISA) is only necessary as a fallback mechanism is brazen, and FISC has since ruled that even the data gathered via Section 702 of FISA is not safe from inappropriate snooping by the FBI. The court was appalled by the “unduly lax” surveillance culture at the FBI, in which agents “conducted tens of thousands of unjustified queries of Section 702 data” – a “routine and encouraged practice.” As we'll discover when we look closer at procedural controls, the FBI has a history of combing through data it should not be authorized to access.
Procedure vs. Reality
Much of the data that the U.S. government obtains from tech companies is not voluntarily shared or accessed informally, instead requiring a well-defined procedure and legal process. According to Greg Nojeim, Senior Counsel at the Center for Democracy & Technology (CDT), the forced handover of data can be divided into three main categories: content, traffic data, and subscriber information. Content is the substance of what a person actually says in their communications, while traffic data is the record of who the parties were, when they communicated, and for how long. Subscriber information allows law enforcement to identify a person via their email address or residential address, for example. Generally, the most sensitive of the three types of information is content, which is why a more demanding legal process is required for it than for the other types of data.
The procedure for compelling data sharing is fairly complicated, explains Nojeim. In order for law enforcement to legally compel a company like Google to turn over all of the data in a person’s account over a certain period of time, the high legal burden of probable cause must be established. This means law enforcement must show a judge that there is strong evidence of a crime in order to obtain an order compelling the disclosure of that information. If law enforcement's case is successful and the order is granted, they can comb through the data they receive using keywords and other selectors to look for potentially incriminating information.
According to Nojeim, with so much information about our lives now available through third-party tech companies, questions frequently arise about evidence discovered during the course of an investigation. For example: “What happens when law enforcement is looking for evidence of child pornography, and it finds evidence of tax evasion instead?” In the U.S., law enforcement is required to return to court and get a warrant to use that incidentally-found evidence. There are further questions on how long law enforcement can hold onto the information that it compelled disclosure of, Nojeim emphasizes, “because eventually, given how much data about us is floating around, it could be that Big Brother becomes very, very big. We are struggling with those questions now.”
Though local and state law enforcement must often meet a high burden of proof to obtain permission to compel data sharing, the standard is much lower for federal agencies such as the NSA and the FBI. “When the FBI wants to comb through data, it currently needs very little justification to do that,” says Nojeim. Thanks to the court rulings disclosed in 2019, we now know for certain that FISC had little or no control over the FBI's access to information.
As one might expect, secrecy is a key facet of compelled data disclosures by the intelligence community. Section 702 authorizes FISC to issue so-called “gag orders” that forbid companies from disclosing the mere fact that they received such a data disclosure request, and to levy fines or even impose jail time on companies that disobey. While defendants in the U.S. have a Constitutional right to see the evidence that law enforcement is using against them, Nojeim points out the irony that, when it comes to intelligence investigations, “information can be collected, combed through and analyzed secretly, and the target of the investigation will never know that that happened, unless that person is charged with a crime.”
The secrecy that covers the forced disclosure of content, traffic data, and subscriber information is subject to certain limitations. As Nojeim explains, the government must prove a continuing need for secrecy on more-or-less an annual basis. If that need can no longer be proven, then companies are permitted to make the disclosure. Many companies follow the practice of releasing regular transparency reports which detail how frequently government agencies request data from them – though the U.S. Department of Justice only permits companies to disclose the number of national security requests in increments of 250. A July 2018 report from the Center for Strategic and International Studies found that 130,000 requests for access were submitted to six of the world’s largest tech companies in 2017, 80% of which were granted. Further, United States telecommunications providers such as Comcast, Verizon, and AT&T received over 500,000 requests to share location data and communications in the same year. The report stated that the number of requests – as reported by the companies – increased significantly over time. Apple, Facebook, Google and Microsoft – among other service providers – publish the latest data they are permitted to share on their websites. In some cases, service providers will also publish a warrant canary alongside a transparency report.
The Data Sharing Battlefield
In addition to publishing transparency reports, companies are pushing back against compelled data handover by fighting the government in court. In 2016, the FBI went to court in an attempt to compel Apple to extract data from an iPhone belonging to one of the San Bernardino shooters, who had killed 14 people in a terrorist attack. Apple refused, citing the need to preserve the security of its users and its systems. The FBI eventually dropped the case after contracting its own hackers to access the data, but Apple gained a reputation for taking a stand against the government in privacy-related cases. The method used to access the iPhone has never been made public, though security researchers have made educated guesses.
The clash between the government and companies for control over user data has grown since 2016, with tech giants like Microsoft leading the charge. In 2016, the company sued the Department of Justice over its regular issuance of extensive gag orders. “The proportion [of demands to disclose stored data] was significant – gag orders were attached to well over half of the law enforcement demands they had received,” Nojeim explains. “[In 2017] they settled the case, and the Department of Justice agreed to limitations on its issuance of gag orders.” Even Google – with its reputation for close cooperation with the NSA – brought a lawsuit in an attempt to resist gag orders in 2013, arguing that the enforced secrecy infringes on the company’s First Amendment rights.
Another frontier where the security-versus-privacy battle is being waged is encryption – a discussion at the heart of what is sometimes referred to as the “going dark” debate. As more and more tech companies improve their own security systems, responding to the global push for better privacy, the adoption of end-to-end encryption (E2EE) is gaining traction. In E2EE systems, messages are encrypted on the sender's device and can be decrypted only by the intended recipient – not even the service provider relaying them can read their contents. Since Facebook announced its intention to introduce E2EE across Messenger and Instagram in addition to WhatsApp, the stakes have been raised. Implementation of E2EE across Big Tech services like Facebook's would prevent the government from monitoring a huge number of messages it would otherwise be able to surveil. Unsurprisingly, governments around the world are protesting the move.
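The core idea can be illustrated with a toy sketch: the two endpoints derive a shared key that the relaying server never learns, so the provider only ever stores ciphertext. The code below is illustrative only – a textbook Diffie-Hellman exchange with a hash-based stream cipher, using a deliberately small prime and no authentication. It is not real cryptography; deployed E2EE systems use vetted designs such as the Signal protocol.

```python
import hashlib
import secrets

# Toy public parameters. Real systems use large, standardized groups
# (e.g. RFC 3526) or elliptic curves; this small prime is readable
# but offers no actual security.
P = 2**127 - 1  # a Mersenne prime
G = 5           # generator (illustrative)

def keypair():
    """Generate a Diffie-Hellman private/public pair."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Each endpoint derives the same key; the relay, seeing only
    the public values, cannot."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(secret.to_bytes(16, "big")).digest()

def xor_stream(key, data):
    """Toy stream cipher: XOR data against a SHA-256-derived
    keystream. Encryption and decryption are the same operation."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, stream))

# Alice and Bob exchange public values (which the relay may observe)...
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
# ...and independently derive the same symmetric key.
k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)

msg = b"meet at noon"
ciphertext = xor_stream(k_alice, msg)  # this is all the relay ever sees
assert xor_stream(k_bob, ciphertext) == msg
```

The point of the sketch is structural: because key derivation happens only on the endpoints, a provider compelled to hand over its stored traffic has nothing but ciphertext to give.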
In an open letter published in 2019, William Barr, President Trump’s attorney general – who in 1992 was responsible for authorizing a mass surveillance program without proper judicial oversight – called on Facebook to halt its encryption roll-out and give governments a “back door” to encrypted messages and calls, allowing them to bypass the barrier of encryption. Barr’s open letter was co-signed by the U.S. Secretary for Homeland Security, the UK Home Secretary, and the Australian Minister for Home Affairs – officials from three countries of the Five Eyes alliance.
As The Privacy Issue documented in our article on journalist repression, the Five Eyes countries share data between their intelligence agencies and enjoy mutual and unrestricted access. The Five Eyes alliance has long been pushing for unlimited cross-border data sharing provisions. Their 2018 open memo demanded that tech companies create special solutions for them, threatening legislation that would force companies to comply if they did not do so voluntarily. Such legislation has already been passed in the United Kingdom and Australia.
In response, tech companies are arguing that encryption backdoors not only undermine customer privacy, but also the security of their services. These backdoors subject their technology to the ultimate authority of law enforcement, while also making it easier for data to be accessed by hackers, criminals, and the intelligence operations of nation-state actors. On the same day that Barr’s letter was published, a group of prominent international civil liberties and human rights organizations released an open letter, urging Facebook to resist the calls for encryption backdoors. According to former NSA contractor and whistleblower Edward Snowden, the reason behind government protests against E2EE is “less about public safety than it is about power: End-to-end encryption gives control to individuals and the devices they use to send, receive and encrypt communications, not to the companies and carriers that route them." In an op-ed for The Guardian, Snowden continues:
What this shift [towards encryption] jeopardizes is strictly nations’ ability to spy on populations at mass scale, at least in a manner that requires little more than paperwork. By limiting the amount of personal records and intensely private communications held by companies, governments are returning to classic methods of investigation that are both effective and rights-respecting, in lieu of total surveillance.
FBI veteran and Cyber Defense Labs chief executive Robert Anderson echoes Snowden’s concerns: “Our government cannot sacrifice the ability of companies and citizens to properly secure their data and systems’ security in the name of often vague physical and national security concerns, especially when there are other ways to remedy the concerns of law enforcement.” Anderson and other notable security professionals, like Jim Baker, Director of National Security and Cybersecurity at the R Street Institute and former general counsel for the FBI, have called for public officials to “embrace encryption,” emphasizing that enhancing the integrity and confidentiality of networks is in the interest of government as much as it is in the interest of service providers.
Beyond "Going Dark"
The concerns of law enforcement are already being transformed into action – on a global scale. The 2018 CLOUD Act (Clarifying Lawful Overseas Use of Data Act), which establishes procedures for law enforcement requests of data stored in other countries, relaxes government restrictions on cross-border data sharing. In 2019, the U.S. and UK signed a broad data sharing agreement under the act, allowing law enforcement in the U.S. to compel data from entities in the UK and vice-versa.
As Greg Nojeim told The Privacy Issue, the CLOUD Act allows for the exercise of unchecked government power. For example, it fails to require a strong indication of criminality before a foreign government can compel a U.S. provider to make a disclosure of content, and does not require prior judicial authorization of that disclosure. “The agreement falls far short of that score; it says that [subsequent] judicial review of the disclosure already made is good enough,” Nojeim states, “[CDT] doesn’t think that’s good enough.”
The questions that remain unresolved are numerous and multiplying, as the widening scope and legal ramifications of compelled data sharing become more apparent. As Magistrate Judge James Orenstein wrote in his 2016 rejection of the U.S. government’s attempt to compel Apple to unlock an iPhone, “The need for an answer becomes more pressing daily, as the tide of technological advance flows ever farther past the boundaries of what seemed possible even a few decades ago.” Since then, however, the debate has yet to become constructive, with both sides seemingly talking past each other.
In addition to the open letters and initiatives by cross-industry groups, some tech companies have formulated their own policy recommendations for navigating issues that arise from compelled data sharing. Microsoft has called for the creation of “modern laws that provide law enforcement and national security agencies with appropriate mechanisms to access digital evidence pursuant to lawful process.” Such laws would permit access to digital information only for lawful purposes, preserve technology providers' right to challenge requests, and require more rigorous legal process for more sensitive information.
The legislative frameworks pertaining to data sharing are one of CDT’s major concerns. As Nojeim told The Privacy Issue, these concerns arise from the sheer enormity of information about a person that law enforcement can obtain from one company. CDT believes there should be strict limitations on the scope of the information law enforcement is able to collect on a person. As Nojeim puts it:
They should not be able to go to Google and say "Greg's a bad guy. Give us everything you know about Greg.” They could go to a court and say: “Hey, court, Greg has been communicating with Sally and they have been planning a bank robbery. We want Greg's communications.” That's different. That's particularized, more controlled than just “tell us everything you know about Greg.”
A further issue with the nature of compelled data disclosure, Nojeim explains, is the lack of timely notice – which is necessary to allow individuals to defend their rights:
In the physical world, law enforcement has received permission from a judge to search your home. So they come to your home, knock on your door, they say "Look at this warrant. It permits us to search your home – let us in.” In the digital world... there’s never that confrontation between law enforcement and the target of the warrant. That person doesn't get the automatic notice that happens in the physical world. One of our goals [at CDT] is to try to bring the digital world closer to the kind of notice you get in the physical world.
Bridging the gap between the digital and the tangible is necessary if privacy and autonomy are to be protected – yet building the bridges needed to even talk about these issues can be a difficult task.
The Need For Dialogue
Taking the cumulative effect of these factors into consideration, The Privacy Issue recognizes that there are often competing interests in government and industry. On the one hand, governments have an official mandate to protect national security and, on the other, technology service providers are required to protect the security of their systems and the privacy of their customers. Given how constructive global debate on this issue has stagnated in recent years, it is imperative that industry and government prioritize dialogue and reach consensus on the key issues.
A 2016 report by Harvard's Berkman Klein Center for Internet & Society, Don't Panic. Making Progress on the "Going Dark" Debate, reinforces the need for dialogue in a technological and legal landscape of competing interests. The report was the result of collaboration by a diverse group of experts from academia, civil society, and U.S. intelligence, concluding:
The debate over encryption raises difficult questions about security and privacy. From the national security perspective, we must consider whether providing access to encrypted communications to help prevent terrorism and investigate crime would also increase our vulnerability to cyber espionage and other threats, and whether nations that do not embrace the rule of law would be able to exploit the same access. At the same time, from a civil liberties perspective, we must consider whether preventing the government from gaining access to communications under circumstances that meet Fourth Amendment and statutory standards strike the right balance between privacy and security, particularly when terrorists and criminals seek to use encryption to evade government surveillance.
Intergovernmental organizations, civil liberties organizations, and digital rights advocates have already contributed valuable proposals that must be considered in this discussion. These include the CDT’s focus groups, UNHCR’s International Principles on the Application of Human Rights to Communications Surveillance, Access Now’s Transparency Reporting Index, and Lawfare’s proposed framework for cross border data requests. It is time for cooperation to be prioritized so that individual rights can receive the protection they deserve.