[WASHINGTON, D.C.] – In a letter to the Federal Trade Commission (FTC) today, U.S. Senator Richard Blumenthal (D-CT) submitted evidence that Facebook has failed to abide by the terms of a 2011 consent decree and called for “monetary penalties that provide redress for consumers and stricter oversight.” In today’s letter, Blumenthal laid out the case for stronger standards to protect users’ data by highlighting Facebook’s repeated failures to comply with requirements to “establish and implement, and thereafter maintain, a comprehensive privacy program.”
“Recent revelations about the illegitimate harvesting of personal data on tens of millions of Americans have shed new light on the systemic failure of Facebook to address privacy risks and suggest that Facebook may have run afoul of its consent decree. Despite Mark Zuckerberg’s recent apology tour, Facebook’s history of negligence demonstrates that the company can no longer be trusted to self-regulate,” Blumenthal wrote.
“Mr. Zuckerberg has acknowledged that the incident was a breach of trust between Facebook and its users, a broken promise that requires redress for consumers and enforceable commitments that deter further breaches. It is time for the FTC to thoroughly and rigorously reassess Facebook’s privacy practices and put into place rules that finally protect consumers.”
Last week, at a joint hearing of the Senate Judiciary and Senate Commerce, Science, and Transportation Committees, Blumenthal demanded that Zuckerberg explain why his company did not act to protect Facebook users from third-party developers whose terms of service agreements explicitly allowed them to sell user data. During the exchange, Blumenthal questioned Zuckerberg about a previously undisclosed 2014 terms of service document – obtained by Blumenthal in the course of his investigation into Cambridge Analytica’s use of Facebook data to influence voters – that explicitly permitted Dr. Aleksandr Kogan to “sell, licence (by whatever means and on whatever terms) and archive your contribution and data.”
The full video of Blumenthal’s exchange with Zuckerberg can be watched here.
The full text of Blumenthal’s letter is available here and copied below.
April 19, 2018
The Honorable Maureen Ohlhausen
Federal Trade Commission
600 Pennsylvania Avenue, NW
Washington, DC 20580
Dear Acting Chairman Ohlhausen,
I am pleased that the Federal Trade Commission (FTC) has opened an investigation into the privacy practices and policies at Facebook. Recent revelations about the illegitimate harvesting of personal data on tens of millions of Americans have shed new light on the systemic failure of Facebook to address privacy risks and keep its promises to users. Despite Mark Zuckerberg’s recent apology tour, Facebook’s history of negligence demonstrates that the company can no longer be trusted to self-regulate. I write to draw attention to information that may be relevant to your investigation, including evidence that Facebook may have violated its consent decree. I also encourage the FTC to pursue strong legal remedies that compensate harmed consumers and set enforceable rules on Facebook’s future conduct.
In November 2011, Facebook agreed to a proposed settlement containing a consent decree after the FTC found that the company had deceived consumers by sharing personal data with advertisers and making public information previously designated as private. Under the settlement, Facebook was barred from misrepresenting the privacy of personal information and was required to obtain affirmative express consent before enacting changes that would override privacy preferences. The FTC also required Facebook to establish “a comprehensive privacy program that is reasonably designed to (1) address privacy risks related to the development and management of new and existing products and services for consumers, and (2) protect the privacy and confidentiality of covered information.”
Facebook’s adherence to the consent decree has been called into question by recent reports that the political consulting firm Cambridge Analytica and Global Science Research (GSR) harvested a large-scale dataset of Facebook users through a third-party app. The GSR app collected demographic details, private communications, and other profile metrics of those who installed the app and their friends. Because of Facebook’s permissive default privacy settings, Cambridge Analytica was able to obtain information from up to 87 million profiles even though only about 300,000 users installed the GSR app.
This should have never happened. The FTC put Facebook on notice about the privacy risks of third-party apps in its complaint. Three of the FTC’s claims concerned the misrepresentation of verification and privacy preferences of third-party apps. In 2008, shortly after the launch of its developer platform, Facebook introduced a “Verified Apps” program, which would provide a badge indicating that Facebook had certified the security, privacy, trustworthiness, and transparency of an app. When Facebook announced it would be ending the program the following year, it claimed that it would be extending these trust standards to all apps. However, in its 2011 complaint, the FTC found that despite claims of auditing, Facebook took no steps to verify either the security of apps or their protection of collected user information. Seven years later, exactly how Facebook verifies third-party apps remains murky.
The Cambridge Analytica revelations demonstrate that Facebook continued to turn a blind eye to third-party apps despite the FTC mandated privacy program. Facebook should have been aware that GSR was planning to violate developer platform rules based on the policies that developers are required to submit. GSR’s terms of service (“Attachment 1”) stated explicitly that it reserved the right to sell user data and would collect profile information from friends. These terms of service should have put Facebook on notice that GSR may be seeking to sell user data. At this month’s Senate hearing on Facebook, Mr. Zuckerberg informed me that Facebook’s app review team would have been responsible for vetting the policy and acknowledged that Facebook “should have been aware that this application developer submitted a [terms of service] that was in conflict with the rules of the platform.”
Even the most rudimentary oversight would have uncovered these problematic terms of service. Moreover, Facebook knew as early as 2010 that third-party app developers were selling information to data brokers. The fact that Facebook did not uncover these non-compliant terms strongly suggests that its “comprehensive privacy program” established pursuant to the FTC consent decree was either inadequate to address threats or not followed in practice. This willful blindness left users vulnerable to the actions of Cambridge Analytica.
The Cambridge Analytica matter also calls into question Facebook’s compliance with the consent decree’s requirements to respect privacy settings and protect private information. Three years after Facebook agreed to the consent decree, it continued by default to grant third-party apps broad access to personal data, including data that may not have been marked as public. In evaluating claims of deception and misrepresentation of privacy controls, the FTC has typically considered what a consumer would have reasonably understood their settings to mean. No information was readily provided to users about this permissive sharing with third-party apps or how to opt out. Nor were users informed about which apps accessed their profiles or given the ability to resolve unwanted intrusions. While users could be judicious about their privacy settings and the apps they installed, the actions of a single friend could thwart their efforts without their knowledge. The ease with which the GSR app harvested data on 87 million users demonstrates that third parties were effectively able to override privacy preferences without express consent.
It is also noteworthy that the relaxation of data retention policies for third party developers may have contributed to the illegitimate collection of data. In a version of its Developer Principles and Policies dated December 1, 2009, Facebook mandated that developers “must not store or cache any data you receive from us for more than 24 hours” and “must not give data you receive from us to any third party.” In April 2010, Facebook changed this policy to permit developers to keep user information with significantly reduced restrictions on the sharing of data. There is no indication that Facebook informed its users that third parties would now be allowed to store their data or share it.
Facebook had multiple opportunities to prevent this harvesting and notify users before March 2018, but failed to do so. According to former Cambridge Analytica employee Christopher Wylie, the GSR app had collected data so aggressively that it triggered Facebook’s security protocols. However, there is no indication Facebook took steps to investigate or limit the collection despite the problematic terms of service.
Facebook finally acted on the GSR app after The Guardian reported on Cambridge Analytica’s plans in December 2015. While Facebook removed the application and contacted both companies to request the destruction of user information, its response continued to be inadequate. Facebook did not take any steps to prevent Cambridge Analytica and its partners from continuing to use its platform for advertising or analytics services, even working alongside the company on campaigns. It did not provide notice to users about how their information had been harvested by Cambridge Analytica, nor did it inform the FTC about the collection of data without user consent. Facebook did not contact Christopher Wylie to request the deletion of user data until the following August – at least nine months after the initial report. Facebook took no further action to assess whether data had been deleted. The ineffective response calls into question how seriously the company took this incident and others like it.
Former Facebook employees have told me that its staff were not empowered to effectively enforce privacy policies. For example, Sandy Parakilas, who led efforts to fix privacy problems on its developer platform from June 2011 to August 2012, describes Facebook as a company that would not commit resources or attention to protecting users against violations from third-party apps. Mr. Parakilas’ letter to me (“Attachment 2”) along with his November 19, 2017 New York Times op-ed and April 10, 2018 interview with New York Magazine, highlight a deeply disturbing pattern of disregard by Facebook to the privacy risks posed by third-party apps. Mr. Parakilas recounts how one executive told him, after proposing a deeper audit of developers’ use of data, “Do you really want to see what you’ll find?” Had Facebook taken such requests more seriously at the time, the GSR app might have been caught earlier.
Facebook has acknowledged that it neglected its privacy controls, which included non-functional settings and outdated descriptions that did not reflect how the platform operated. Overall, Facebook’s privacy controls were arcane and difficult to navigate, preventing users from effectuating their preferences. Such deficiencies indicate that Facebook did not maintain a privacy program adequate to protect users and enable them to exercise informed consent.
We may never know the full extent of the damage caused by the failure to provide adequate controls and protections to users. A month after the recent Cambridge Analytica reports, Facebook has not disclosed how many applications engaged in similar data collection, but has stated that it expects to audit thousands of suspicious applications. As before, it remains reactive only to public reports, for example suspending the company CubeYou after media covered its commercial activities. The Facebook developer platform was launched in 2007, and stronger protections for consumers were not implemented until 2015. Presumably many of the companies that developed platform applications have shut down, contact details have changed, and record trails have been lost. While Mr. Zuckerberg has committed to auditing suspicious apps, it is clear that Facebook will never be able to fully assess the impact of its years of neglect.
Facebook now bears little resemblance to the company it was at the time of the consent decree, necessitating a vigorous investigation into its privacy practices across its range of products and activities. Since November 2011, its expansion and acquisitions have strengthened the company’s dominance in the social networking market and magnified the challenges posed to consumers. Consumers, civil society, and members of Congress have raised an expansive set of privacy concerns, including Facebook’s collection of Internet traffic to surveil competitors; purchase of personal information from data brokers; tracking of non-Facebook users across the web; and harvesting of communications metadata from phones. These allegations raise new issues relevant to the consent decree that should be within the scope of the FTC’s review.
The FTC ordered the consent decree in response to Facebook’s repeated failures to address privacy risks, and put into place rules on how the company should act to protect users. If its investigation finds that Facebook has violated the consent decree or engaged in further unfair or deceptive acts and practices, the FTC should both seek monetary penalties that provide redress for consumers and impose stricter oversight on Facebook. It should consider further measures that rigorously protect consumers, such as:
- data minimization standards that require Facebook to retain and use data only for services expressly requested by users;
- limits on the combining and sharing of data between Facebook-owned services;
- transparency on the types of data that Facebook collects from users and from other sources, and to publicly account for how that data is used;
- restrictions on collection of data from its “social plug-ins,” cross-device tracking, and data brokers;
- appointment of a third-party monitor to oversee changes to Facebook’s privacy and data use policies and practices, with periodic reinvestigation; and,
- organizational changes to ensure that privacy and data use is protected at all levels.
While the Cambridge Analytica revelations have raised awareness of Facebook’s failure to provide users with adequate information or safeguards to protect privacy, many have raised legitimate and broad-reaching concerns about the company’s practices beyond a single ‘bad actor’ problem. Mr. Zuckerberg has acknowledged that the incident was a breach of trust between Facebook and its users, a broken promise that requires redress for consumers and enforceable commitments that deter further breaches. It is time for the FTC to thoroughly and rigorously reassess Facebook’s privacy practices and put into place rules that finally protect consumers.
Thank you for your attention to this important matter.
“Guiding Principles.” Facebook Developers. https://web.archive.org/web/20080902015608/http://developers.facebook.com/get_started.php?tab=principles
“Facebook Shuts Down Apps That Sold User Data, Bans Rapleaf.” AdAge. October 29, 2010. www.adweek.com/digital/facebook-shuts-down-apps-that-sold-user-data-bans-rapleaf/
“Developer Principles and Policies.” Facebook Developers. December 1, 2009. https://web.archive.org/web/20091223051700/http://developers.facebook.com/policy/
“A New Data Model.” Facebook. April 21, 2010. https://web.archive.org/web/20120502125823/http://developers.facebook.com/blog/post/378/
Cadwalladr, Carole. “‘I made Steve Bannon’s psychological warfare tool’: meet the data war whistleblower.” The Observer. March 17, 2018. https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump
“It’s Time to Make Our Privacy Tools Easier to Find.” Facebook. March 28, 2018. https://newsroom.fb.com/news/2018/03/privacy-shortcuts/