The Federal Trade Commission announced that it reached a settlement with Everalbum, Inc., the developer of the photo app Ever, resolving allegations that the company deceived consumers about its use of facial recognition technology and its retention of users' photos and videos.  As part of the settlement, Everalbum agreed to obtain consumers' express consent before using facial recognition technology on their photos and videos and also agreed to delete models and algorithms it developed by using the photos and videos that were uploaded by users.  

In the announcement of the settlement, Andrew Smith, the Director of the FTC's Bureau of Consumer Protection, said, "Ensuring that companies keep their promises to customers about how they use and handle biometric data will continue to be a high priority for the FTC." 

The FTC alleged that Everalbum launched a "Friends" feature in 2017, which uses facial recognition technology to group users' photos by the faces of the people who appear in them.  The FTC said that when the feature was launched, it enabled facial recognition by default for all users of the app -- and didn't give users an option to disable the feature.  Over the next two years (first in 2018, for users in certain jurisdictions that restrict the use of biometric data, and then in 2019, for all other users), Everalbum rolled out a process where users got to choose whether to enable the facial recognition technology.  The FTC charged that even though most consumers didn't have the ability during that time to approve the use of facial recognition technology on their photos, Everalbum misled consumers into thinking that the technology would not be used without their approval.  Specifically, the FTC said that the "Help" section of Everalbum's website misled consumers by saying, "When face recognition is turned on, you are letting us know that it's ok for us to use the face embeddings of the people in your photos and videos, including you, and that you have the approval of everyone featured in your photos and videos."  In other words, the FTC is arguing here that the help section communicated to consumers that facial recognition technology wouldn't be used unless consumers had consented to its use. 

The FTC also alleged that Everalbum misled users about what would happen to their photos when they deactivated their accounts.  The FTC charged that even though the company's privacy policy promised that photos and videos would be deleted "as soon as possible" after an account was deactivated, many users' photos were not deleted at all, but were retained indefinitely. 

In connection with the settlement, Commissioner Rohit Chopra issued a separate Statement, lauding the fact that the settlement with Everalbum requires the company to delete the facial recognition technologies that were "enhanced by any improperly obtained photos," which he noted was a different approach from other cases, where companies weren't required to give up their "ill-gotten data."  Chopra also criticized the settlement for not including any financial penalty, saying, "the FTC needs to take further steps to trigger penalties, damages, and other relief for facial recognition and data protection abuses." 

While there are many interesting things about this enforcement action, one thing that jumped out at me as particularly significant was that the FTC's false advertising allegations were based, in large part, on a fairly innocuous statement in the "Help" section of the company's website.  It's a good reminder that anything you say, no matter where it appears, could form the basis of a false advertising claim.  (Of course, if an advertiser tried to rely on the "Help" section to modify its advertising claims, the language would likely be seen as a lot less important . . . .)