Earlier today, my husband decided to prune his in-box and unsubscribed from a few of the mailing lists he joined in the run-up to the 2020 Presidential race.  A simple task, you say.  Hell, no!  One site asked if he was absolutely sure and flashed a red warning sign.  Heart pumping, he hit the toggle to unsubscribe anyway, and got another big red warning that he would lose access to important information.  (He took the unsubscribe plunge anyway.)  And all this from a Good Organization doing Good Work.  Not even trying to sell him anything!

This, though, is exactly the kind of user experience that the FTC’s Dark Pattern Workshop addressed last week.  The workshop shed light on ubiquitous and pernicious online activities designed to drive sales and obtain data in deceptive ways.  It was a very dispiriting day.

So, what is a “dark pattern”?  It’s a user interface designed to override a user’s own judgment and cognition — a technique for manipulating users into doing things they might not otherwise do.  Examples include hiding, during the sign-up process, the fact that a free trial automatically converts to a paid subscription; asking a question in a tricky way (if you’re asked whether you’re sure you want to cancel your membership, should you click “cancel” or “continue”?); offering only a “not now” option, rather than a “no” option, in response to a request to receive push notifications; or using color and fonts to highlight the action you want the consumer to take and obscure those you don’t.  These are just a few.

Dark patterns are not new, and the FTC’s interest in them isn’t either.  But, with the advent of artificial intelligence tools designed to optimize the likelihood that a user will take an action desired by the marketer, dark patterns are becoming even more common and widespread.  And enforcement may be challenging — not because the FTC lacks enforcement authority or the mechanisms to bring actions against deceptive marketers.  Rather, as described in the workshop, companies’ use of AI allows for microtargeting and for automatic adjustments to a user’s interface in real time and in highly personalized ways.  That will make enforcement agencies’ task of collecting evidence and running copy tests that much more challenging.

One particularly troubling issue addressed in the workshop is the fact that dark patterns have an outsize impact on communities of color.  One panelist on a panel dedicated to this issue, Mutale Nkonde, described a ProPublica investigation into how a tax prep service offered free help to low-income taxpayers but effectively hid the free option, using digital tools to steer users toward services requiring the payment of a fee.  Although the tools (characterized as “dark patterns” by ProPublica) did not steer consumers based on race, they did steer consumers based on income.  As Nkonde noted, given the demographics of the US population, income effectively acted as a proxy for race.  Zip code and other demographic indicia can act as similar proxies, resulting in disparate racial impact when they are used in dark pattern practices.  Further, the panelists noted, the documented harm caused by such dark patterns is not just financial; it also includes shame, embarrassment, wasted time, and invasion of privacy.

The panelists — regulators and self-regulators, academics, web designers, and scientists — all talked about the need for more regulation and more enforcement.  Some urged the enforcement agencies to focus their efforts on the large mainstream companies engaging in these practices, in order to really get the point across that use of dark patterns is deceptive, is illegal, and will carry consequences.  The FTC is seeking public comment on topics related to dark patterns, which means that a report, and perhaps guidance, are on the way.  And watch out for further enforcement.

(Description of graphic for visually impaired readers: photograph of a man's hand manipulating a marionette.)