On July 24, 2019, the FTC announced a $5 billion settlement with Facebook to resolve Facebook’s alleged violations of the FTC Act and its 2012 consent order with the FTC. The settlement comes as no surprise to the privacy community – Facebook has been closely scrutinized by the public and regulators since the Cambridge Analytica incident came to light in March 2018, and the company indicated to investors earlier this year that it anticipated an FTC fine of between $3 billion and $5 billion.
We have read the complaint, settlement, and press releases issued by the FTC and Facebook, and provide our thoughts below on what it means for business:
Privacy compliance matters in 2019
The settlement is attracting significant press, in part because of the size of the fine. Facebook must pay $5 billion, the largest penalty ever imposed by a U.S. regulator for privacy violations. According to the FTC, the fine amounts to nearly 9% of Facebook’s 2018 annual revenue – more than twice the maximum fine available under the EU General Data Protection Regulation (“GDPR”) (4% of annual revenue) and nearly 20 times the size of the Equifax settlement of $275 million. While the fine is seemingly quite large, many in the privacy community view it as insufficient. FTC Commissioner Rohit Chopra issued a dissenting statement arguing that the settlement does little to address Facebook’s unjust gains from its data practices – a perspective bolstered by the fact that Facebook posted strong quarterly earnings the same day the FTC announced the settlement. Whatever the sufficiency of the fine, it is clear that privacy compliance is a paramount concern for regulators and the public in 2019. Failure to incorporate privacy by design has real monetary consequences for business, consequences that will only amplify once the California Consumer Privacy Act (“CCPA”) takes effect in January 2020.
The $5 billion penalty is a result of Facebook’s alleged violation of its 2012 consent order
The complaint asserts six counts against Facebook, five of which stem from alleged violations of the 2012 consent order between the FTC and Facebook. According to the FTC, Facebook violated the 2012 order by misrepresenting the extent to which users could control the privacy of their data, misrepresenting the extent to which Facebook made user data accessible to third parties, and failing to implement and maintain a comprehensive privacy program. Only the sixth count asserts a violation of the FTC Act for deceptive practices. The FTC almost certainly structured the complaint this way to establish a strong basis for its fining authority – violations of a consent order carry civil penalties of up to $42,530 per violation. The takeaway is that while privacy is incredibly important to the FTC, the agency likely would not have issued such a large penalty against Facebook absent the prior consent order.
European regulation continues to influence U.S. practice
While the fine is noteworthy, the most significant aspect of the settlement is that Facebook must implement a comprehensive and detailed privacy program overseen by compliance officers who can be removed only by an independent privacy committee. As part of the program, Facebook must assess and document internal and external risks to privacy and security, including through impact assessments for any new or modified products – where a product presents a “material risk” to privacy, Facebook must evaluate the privacy risks and potential safeguards. This impact assessment requirement resembles a concept found in the GDPR: where a business believes a processing operation is likely to result in a “high risk” to the rights and freedoms of natural persons, the business must carry out a data protection impact assessment. The FTC’s inclusion of such a requirement reinforces that EU data protection regulations continue to influence U.S. business – in the past year, we have seen companies dramatically change their practices to comply with the GDPR, and U.S. states propose and pass their own laws (including the CCPA) with GDPR-like requirements. As a practical matter, the settlement demonstrates that business efforts to comply with the GDPR can help address requirements under U.S. law and regulator concerns. That said, EU requirements differ from those in the U.S., and businesses need to independently evaluate their compliance with U.S. law.
Expect the obligations imposed on Facebook to flow down through the advertising ecosystem
According to the complaint, Facebook required third parties (including developers) to agree to its terms but failed to conduct due diligence on those third parties’ practices. Where Facebook discovered a violation, it allegedly failed to enforce its terms, or took into account the amount of money a third party generated for Facebook in deciding whether to enforce them. The settlement requires Facebook to take responsibility for the actions of third parties on its platform: Facebook must require third parties to disclose their data practices and self-certify compliance with Facebook’s terms; deny or terminate access for third parties that fail to certify; monitor third-party compliance (including through reviews and audits at least once every 12 months); and take enforcement action against violations of the terms. These obligations are notable because they directly impact advertisers and publishers that use Facebook’s platform. Although Facebook rolled out new terms last year for Custom Audiences and other services, companies should expect further tightening of the terms and additional auditing oversight, which may limit certain data practices on the platform. These flow-down obligations increase the need for companies to conduct their own due diligence – both around the data they provide to the Facebook platform and around the service providers and vendors to whom they provide data obtained from the platform.
Data security is still paramount
Although not explicitly discussed in the complaint, Facebook faced repeated scrutiny in 2018 and 2019 for its data security practices, including in connection with the Cambridge Analytica incident and its storage of account passwords in plaintext. The settlement requires Facebook to implement and maintain a comprehensive security program. Among other obligations, Facebook must secure user passwords in storage and in transit, implement scans designed to detect whether user passwords are stored in plaintext, refrain from asking users to provide passwords to third-party websites, and report incidents affecting 500 or more users. These requirements are good examples of baseline measures companies should take to secure the data they process. In addition, the FTC imposed strict access and deletion periods following a user request – deletion within 120 days of termination of an account, and prevention of third-party access on the platform within 30 days of termination. The takeaway: poor security and data minimization practices increase the likelihood of security incidents, which are easy targets for regulators and plaintiffs’ attorneys (particularly under the CCPA’s new private right of action).
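For engineering teams, the password-handling requirements described above map to familiar baseline controls. As a rough illustration only – this sketch is not drawn from the settlement, and the function names are hypothetical – a service can avoid ever storing a plaintext password by keeping only a salted, deliberately slow hash and comparing login attempts against it:

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 310_000) -> str:
    """Derive a salted PBKDF2 hash so the plaintext never needs to be stored."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # Store algorithm, work factor, salt, and digest together; never the password.
    return f"pbkdf2_sha256${iterations}${salt.hex()}${digest.hex()}"

def verify_password(password: str, stored: str) -> bool:
    """Re-derive the hash for a login attempt and compare in constant time."""
    _, iterations, salt_hex, digest_hex = stored.split("$")
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), int(iterations)
    )
    return hmac.compare_digest(candidate, bytes.fromhex(digest_hex))

record = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", record)
assert not verify_password("wrong guess", record)
```

Because only the salted hash is stored, a "plaintext scan" of the kind the settlement contemplates would find nothing recoverable, and a database leak does not directly expose user credentials.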
Be careful not to misrepresent choice or make it difficult for users to exercise choice
The complaint alleges that Facebook informed users that they could exercise choice over their privacy by making their comments public or private, but did not inform users that such settings did not affect Facebook’s sharing of data with third parties. The complaint further alleges that while Facebook did offer controls for users to limit their sharing of data with third parties, those settings were not readily apparent to users and were often buried in menus without sufficient disclosure. According to the FTC, while Facebook at one point disclosed the settings through a disclaimer at the top of its desktop settings page, it removed the disclaimer after four months. The takeaway for business is that disclosures and choices need to be clear, conspicuous, and accurate. While marketing and sales teams may want to limit or bury disclosures and choices to create a more frictionless experience, regulators could deem such practices unfair or deceptive.
Do not use data for undisclosed or unexpected purposes
According to the complaint, Facebook requested telephone numbers from users in order to provide them with two-factor authentication and password recovery, but then used those numbers for advertising purposes. The settlement prohibits Facebook from using or sharing those phone numbers for advertising purposes. The lesson for business: while marketing and sales teams may want to repurpose information collected for one stated purpose, doing so is risky because it does not conform to user expectations. Before engaging in any processing operation, a company should ask whether a reasonable person would expect the company to use the data for that purpose. If the answer is no, the company should reevaluate the processing operation.
Facial data is a sensitive category of data that requires increased protections
According to the complaint, Facebook uses facial recognition technology to identify users on the platform. When Facebook first introduced the technology, it was turned on by default. Facebook later decided to turn facial recognition off by default and notified users of the policy change. The FTC alleges that, despite this notice, Facebook kept the technology on by default for certain users who had never opted in. The settlement requires Facebook to provide clear and conspicuous disclosure and obtain affirmative express consent for the use of facial recognition, except for fraud prevention and other limited purposes. The takeaway is that facial recognition data is on regulators’ radar and requires a high degree of care, with clear notice and often opt-in consent. Facial recognition technology may also implicate certain U.S. state laws, including the Illinois Biometric Information Privacy Act, which carries a private right of action.
Keep records of data processing operations
The complaint alleges that Facebook retained insufficient records of its data sharing practices with third parties. Under the settlement, Facebook must create robust records, undergo privacy assessments by independent third parties, and submit to compliance monitoring for 20 years. The GDPR and other privacy regulations impose strict recordkeeping obligations on companies, and the FTC’s settlement reinforces the importance of abiding by them.
Investors need to know about data practices
On the same day as the FTC settlement, the SEC issued a press release announcing that Facebook agreed to pay $100 million to settle charges that it failed to disclose data misuse in connection with the Cambridge Analytica incident. This settlement underscores that privacy compliance is incredibly important to investors and the SEC, and that non-compliance may pose a material risk to business.