Social media networks today enable people of all ages and walks of life to connect and stay in touch. One such network is Discord, a real-time chat platform operated by Discord Inc., offering text, voice, and video communication services in desktop, browser, and mobile versions (collectively, "Discord"). Discord was originally created to make it easier for video gamers to communicate, and it is especially popular among multiplayer video gamers, esports players, and game streamers. It has since attracted influencers, YouTubers, and people simply wanting to hang out with friends, who find refuge there from trolls and the greater chaos of the internet. Once inside the platform, users can join different servers, each functioning as its own community. Within each server, users can plug into various hashtag channels (which the server administrator can designate as public or private). Think AIM meets Slack. In addition to group conversations, users can have private conversations with any user they have added as a friend or with any user who is a member of a server they have joined.
Of course, a social media platform such as Discord is likely to garner the attention of children and teens, which prompted the Children's Advertising Review Unit ("CARU") to take a deeper dive into the platform through its routine monitoring practices. CARU noticed that Discord has channels featuring video games with large numbers of users under 13. Many YouTube influencers with large tween and teen followings also have channels on Discord and encourage their viewers to visit them. As a result, CARU was concerned that Discord may attract a large number of children under 13 and thus may be considered directed to children under CARU's Guidelines and Section 312.2 of the Children's Online Privacy Protection Rule (the "COPPA Rule"). If so, CARU wanted to ensure that Discord was not collecting, or allowing the disclosure of, personally identifiable information from children without first obtaining verifiable parental consent.
Discord maintained that the platform is primarily aimed at adults and secondarily at teens. As evidence of this assertion, Discord noted that it is not a typical social media platform because (i) it does not have browsable public profiles, (ii) it has no "like" count feature, and (iii) it has no news feeds. In addition, Discord referenced its original intended use as in-game chat for gamers and cited a recent report by the Entertainment Software Association finding that the average age of gamers is 33.
Discord stated that it has taken several steps to ensure that its platform is and remains directed primarily to a general audience, promotes the online safety of its users, and prevents children under 13 from joining the social network, including the following actions:
- Not posting its mobile app in the “Kids” or “Family” section of any app store. In addition, Discord is rated as "Teen" in Google Play and "12+" in the Apple App Store.
- Not marketing or advertising Discord to individuals under 13.
- Designing an adult-oriented and challenging user interface.
- Using a paid subscription business model featuring the high price point of $9.99 per month or $99.99 per year. In addition, Discord's business model has never been supported by behavioral advertisements or the monetization of user data.
- Creating a Trust and Safety team (constituting over fifteen percent of its workforce) to investigate and review reports and complaints and to enforce Discord's policies and guidelines, including those concerning use of the platform by underage users. Reports of users under 13 are promptly investigated, and Discord immediately removes users found to be under 13.
- Implementing a neutral age-gate.
CARU Case #6413
"Though it isn’t always the case, the outcome we hope for is proactive corporate accountability on children’s privacy, and that is exactly what Discord delivered." -- Dona Fraser, Senior Vice President, Privacy Initiatives, and Director of CARU