With advances in voice recognition and machine learning, we see an increasing number of products that implement some form of vocal interaction. Popular examples include Google Home and Amazon Echo, as well as software assistants such as Apple’s Siri and Microsoft’s Cortana. Recently, companies such as Mattel have developed toys with voice interaction features for children, such as the Hello Barbie doll. However, with increases in artificial intelligence come increases in data collection, and with increases in data collection come increases in privacy concerns.
Parents and privacy advocates alike expressed concern over Hello Barbie’s potential to store records of intimate conversations. Concerns intensified when it was reported that the toy was susceptible to remote hacking through wireless networks. The latest companies under fire are China-based toy manufacturer Genesis Toys and US software company Nuance Communications. In December 2016, the Electronic Privacy Information Center (EPIC), along with the Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, and the Consumers Union, filed a complaint with the Federal Trade Commission (FTC) over privacy concerns in two toys, My Friend Cayla and i-Que Intelligent Robot, designed by Genesis with software written by Nuance. The complaint alleges that the companies violate a 1998 law, the Children’s Online Privacy Protection Act (COPPA), by collecting data deceptively, without proper notice or parental consent.
The toys use Bluetooth to connect to an associated mobile app, which contains user-entered information such as the child’s name, the parents’ names, and the name of the child’s school. The app also collects the user’s IP address and requests access to many of the device’s functions, including, in the case of the i-Que Robot, the camera, which is never actually used. Beyond this, voice recordings are uploaded to and stored on Nuance’s servers. Nuance’s privacy policy remains vague, and it is possible, and alleged by the complaint, that Nuance shares this data with law enforcement. In fact, Nuance offers a separate service known as Identifier, which is specifically designed to aid law enforcement using voice recordings.
So how does this relate to COPPA? Well, COPPA outlines several rules for tech companies that collect data from children under 13. Some of these rules include:
- Post a clear and comprehensive online privacy policy describing their information practices for personal information collected online from children
- Provide direct notice to parents and obtain verifiable parental consent, with limited exceptions, before collecting personal information online from children
- Give parents the choice of consenting to the operator’s collection and internal use of a child’s information, but prohibiting the operator from disclosing that information to third parties (unless disclosure is integral to the site or service, in which case, this must be made clear to parents)
- Retain personal information collected online from a child for only as long as is necessary to fulfill the purpose for which it was collected and delete the information using reasonable measures to protect against its unauthorized access or use
The FTC complaint alleges that the privacy policies for both Cayla and i-Que are “confusing and hard to access.” It also argues that the measures taken to obtain parental consent prior to using the toy are insufficient. Finally, it argues that Genesis allows itself to retain data indefinitely, and that security measures to prevent unauthorized access are poor.
Indeed, there may very well be cause for concern. Not only are users subject to the privacy policies for the individual toys; they are also covered by Nuance’s own general privacy policy. Add to this the fact that the toys’ privacy statements tell the user that they are subject to change at any time, and keeping information symmetric between the user and the company becomes difficult. What makes the issue even more confusing is that the different privacy policies sometimes contradict one another. For instance, Nuance states that “if a person under 13 submits information through any part of a Nuance Website or a Nuance Product, and we learn the person submitting the information is such a child, we will attempt to delete this information as soon as possible.” It is unclear whether the software in the toys would be considered a “Nuance Product.”

Nuance itself responded, claiming that it does not sell collected voice data for marketing purposes or share it with any other customers. Whether “customers” includes government agencies remains to be seen; if it does not, privacy advocates would no doubt take umbrage at these policies. Furthermore, Nuance could face a legal battle over retaining voice data for longer than is necessary to fulfill the specific purpose of improving the toy in question.

Possibly the most disconcerting aspect of this case, however, is the toys’ poor security. The complaint reads, “Researchers discovered that by connecting one phone to the doll through the insecure Bluetooth connection and calling that phone with a second phone, they were able to both converse with and covertly listen to conversations collected through the My Friend Cayla and i-Que toys.” Given the emphasis COPPA places on children’s security, Nuance and Genesis would likely have to at least add further security to these toys.
One may wonder: why stop here? Wouldn’t COPPA affect other products that also gather data about children under 13, even if they are not the target audience? In fact, some experts have expressed concern about potential legal violations by Google Home and Amazon Echo, noting that while parents might have the ability to consent at certain times, there may be several other times when they do not. Even if that were not the case, the practicality of obtaining parental consent before every instance of data collection (i.e., every time a child speaks) is questionable at best.
Along with Facebook’s battle over biometric data collection, we see a recurring theme: data collection versus privacy protections. Privacy advocates generally want less collection and retention and greater transparency, arguing that the widespread gathering of data poses a threat to security and civil liberties. On the other hand, tech companies argue that digital information, when handled correctly, can be highly useful in improving goods and services through analysis and machine learning. Perhaps the two groups will strike a healthy balance at some point in the future. For now, however, we will have to wait and see.