In 2016, Amazon sold millions of Amazon Echos worldwide. Brands have taken notice and have begun to develop and release third-party software integrations, or “Skills” as Amazon calls them, for the Alexa platform as well as for other voice-enabled platforms and devices. Once again, the law is playing catch-up with technology, and we are here to help you navigate the thorny issues.
Alexa, what are voice-enabled platforms and devices?
For purposes of this post, voice-enabled platforms and devices are those that use a combination of software and hardware to continuously listen and respond to the surrounding environment. The most well-known device dedicated to this “always-on” functionality is the Amazon Echo. The Amazon Echo, based on the Alexa platform, uses a far-field microphone to listen at all times for the wake word “Alexa,” and upon identification of that word, records and transmits subsequent sounds over the internet in order to determine and provide an appropriate response. Other examples of voice-enabled devices that combine software and hardware from a single manufacturer include Google Home (based on the Google Assistant platform) and Microsoft Kinect (based on the Windows Cortana platform). While Apple does not offer a standalone device at this time, it provides similar “always-on” functionality through its iOS platform when a user utters the key phrase, “Hey Siri.”
What makes the voice-enabled platforms and devices from Amazon and Google particularly attractive to brands is that they are now widespread among target audiences and allow brands to integrate their own services through the use of APIs. Two types of APIs are offered at this time. The first type allows brands to integrate Skills into the voice-enabled platform. For example, a brand could create a weather Skill that responds to users’ questions about the weather. Under this scenario, the brand would develop and make the weather Skill available through the voice-enabled platform, users would find and install the Skill through that platform, and users would trigger the Skill with a key phrase, such as by stating the name of the brand and then asking about the weather. Once the Skill is triggered, the voice-enabled platform operator would record and exchange subsequent information with the brand, including a transcript of the recorded sounds. If you think this process sounds familiar, you are right: it is comparable to the app ecosystems available on smartphone operating systems.
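To make that exchange concrete, below is a minimal, hypothetical sketch (in Python, using Flask) of the brand-side endpoint for such a weather Skill. The endpoint path, field names, and response format are our own illustrative assumptions rather than any platform’s actual Skill API; they simply show the shape of the request a platform would forward, including the transcript of the user’s utterance, and the spoken response the brand would return.

```python
# Hypothetical brand-side endpoint for a weather Skill.
# The request/response field names below are illustrative assumptions,
# not the actual schema of Alexa, Google Assistant, or any other platform.
from flask import Flask, request, jsonify

app = Flask(__name__)

def look_up_forecast(city: str) -> str:
    # Placeholder; a real Skill would call the brand's own weather service.
    return "sunny with a high of 72 degrees"

@app.route("/skill/weather", methods=["POST"])
def handle_weather_request():
    payload = request.get_json(force=True)

    # The platform forwards a transcript of what the user said after the
    # Skill was triggered, plus any parsed "slots" such as a city name.
    transcript = payload.get("transcript", "")
    city = payload.get("slots", {}).get("city", "your area")

    # Brand-side logic: look up the forecast (stubbed out above).
    forecast = look_up_forecast(city)

    # Return text for the platform to speak back to the user.
    return jsonify({
        "speech": f"The forecast for {city} is {forecast}.",
        "end_session": True,
    })

if __name__ == "__main__":
    app.run(port=5000)
```

Note that even this toy handler receives a transcript of the user’s speech, which is exactly the category of data driving the privacy issues discussed below.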
The second type of API has far more expansive implications and allows brands to integrate the voice-enabled platform into their own brand-developed hardware. For example, a refrigerator manufacturer could integrate the voice-enabled platform into its refrigerators to provide smart home functionality to its users. As brands are only just starting to use the second type of API, this post will focus on the privacy concerns implicated by the development of Skills. We will address the integration of voice-enabled platforms in brand-developed hardware in a subsequent post.
Okay Google, what are some of the privacy issues here?
Glad you asked.
One significant issue brands should consider is whether their use of information collected through a Skill complies with the governing privacy policies of the applicable voice-enabled platform. Similar to smartphone app stores, voice-enabled platforms have their own developer agreements that govern use of the platform. Brands should carefully review these agreements to make sure their intended use of the collected information does not violate their provisions and give the platform operator grounds to bring legal action against the brand or to claim ownership of the data the brand collects.
Brands should also consider whether they themselves provide users with sufficient notice and choice regarding the information they collect through the voice-enabled platform. Again, as with smartphone app stores, voice-enabled platforms give brands the option to post a privacy policy where users download their Skills. Brands should take advantage of this feature and make sure the posted privacy policy accurately describes how they use the data they collect through the voice-enabled platform. Failing to post a privacy policy, or posting an inaccurate one, could constitute an unfair or deceptive act or practice under Section 5 of the Federal Trade Commission Act, as well as a violation of state laws such as the California Online Privacy Protection Act (CalOPPA), and warrant scrutiny by federal and state regulators such as the FTC and State Attorneys General. Brands that intend to use the data for marketing purposes should also comply with the self-regulatory guidelines issued by the Digital Advertising Alliance (DAA) and other self-regulatory bodies.
Another issue brands should consider is the type of information they collect and who is providing that information. As with other forms of data collection, brands that target children under the age of 13 will need to be especially careful not to violate the Children’s Online Privacy Protection Act (COPPA). Even if a brand does not target children, voice-enabled platforms and devices currently offer no way to age-screen users, and a brand could face liability under COPPA for knowingly collecting information from children if, for example, the voice-enabled platform records a child mentioning his or her age. Similarly, brands may unexpectedly find themselves in possession of health-related data, background noise from individuals who never consented to the privacy policy, and even data from users outside the U.S., which could implicate international data transfer laws and/or subject the brand to the jurisdiction of other countries. Brands should consider building into their Skills, in the spirit of Privacy by Design, the ability to screen the information they collect and, to the extent permitted by applicable law, to remove information that could create potential liability. Such measures are particularly important in light of the EU General Data Protection Regulation (GDPR), which goes into effect in May 2018.
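As a rough illustration of the kind of screening contemplated above, the sketch below flags incoming transcripts that suggest an under-13 speaker or health-related content before they are stored. The patterns, terms, and handling are illustrative assumptions only; what may or must be filtered, retained, or deleted is a legal question that depends on the applicable statutes and the brand’s own policies.

```python
# Illustrative Privacy-by-Design-style screen applied to transcripts before
# storage. The patterns below are examples only; what must be filtered (and
# whether filtering is permitted) depends on applicable law.
import re

AGE_PATTERN = re.compile(r"\bI(?:'m| am)\s+(\d{1,2})\s+years?\s+old\b", re.IGNORECASE)
HEALTH_TERMS = {"diagnosis", "prescription", "blood pressure", "diabetes"}

def screen_transcript(transcript: str) -> dict:
    """Flag transcripts that may create COPPA or health-data exposure."""
    flags = []

    match = AGE_PATTERN.search(transcript)
    if match and int(match.group(1)) < 13:
        flags.append("possible_under_13_user")

    lowered = transcript.lower()
    if any(term in lowered for term in HEALTH_TERMS):
        flags.append("possible_health_data")

    return {
        "store": not flags,  # discard or quarantine flagged transcripts
        "flags": flags,
    }

# Example: a flagged transcript would be withheld from ordinary storage.
print(screen_transcript("Alexa, I'm 9 years old, what's the weather?"))
# -> {'store': False, 'flags': ['possible_under_13_user']}
```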
Voice-enabled platforms and devices also raise unique issues regarding law enforcement access to data. Last month, police in Arkansas served Amazon with a warrant seeking records collected by an Amazon Echo to assist with a murder investigation. While the police turned to Amazon in that case, and Amazon declined to produce the requested information, in future cases police may instead seek the information from a brand that operates a Skill installed by a suspect. Brands should make sure that their privacy policies preserve the ability to respond to lawful subpoenas, government and regulatory requests, and discovery requests, and that they are prepared to object where such requests are overbroad or seek information the brand is prohibited from disclosing under applicable law.
While there are numerous other issues to consider, one last point that warrants discussion at this time is the “creepiness factor.” In 2015, Samsung received public criticism after updating its privacy policy in a way that implied some of its televisions collected personal information through voice recognition software. Since then, California enacted Section 22948.20 of the California Business and Professions Code, which requires prominent disclosure of voice recognition features in smart TVs and restricts the collection of voice recognition data through smart TVs, in particular for advertising purposes. Also in 2015, Mattel received criticism for its Hello Barbie doll, which records and stores conversations with children for processing by voice recognition software. Brands should remember that even when a particular practice does not violate a specific privacy statute, users could view it as an invasion of their privacy, which could impact a brand’s public perception and bottom line.
Hey Siri, how do I find out more?
The legal landscape of voice-enabled platforms and devices is rapidly evolving, and we will continue to monitor and keep you updated on the latest trends.