While Brazil's National Data Protection Authority (ANPD) struggles to meet its General Data Protection Act (LGPD) regulatory schedule for the two-year period of 2021 and 2022, Europe is making strides in regulating and guiding various aspects of the General Data Protection Regulation (GDPR), as shown below.
The European Data Protection Board (EDPB), an independent body based in Brussels, Belgium, which works towards the consistent application of data protection rules across the European Union and promotes cooperation among the European community's data protection authorities, has issued guidelines on interpreting the GDPR with regard to the interaction with virtual voice assistants, or “VVAs”, such as Cortana, Alexa, Siri, Bixby, etc.
As the technology advances, VVAs have become tremendously popular, also drawing Hollywood's attention: at least two films have been released on the subject, “Her”, starring Joaquin Phoenix, and “Jexi”, starring Adam DeVine. Both films are amazing.
As to the guidelines, there is growing concern among authorities that VVAs have increasing access to users' private information which, if poorly managed, can cause damage to users and violate their rights to data protection and privacy.
According to a technical definition, a VVA allows performing different tasks such as sound capture and reproduction, automatic speech-to-text, automatic language processing, dialogue capability, access to ontologies (data sets and concepts structured regarding a given domain) and to external sources of knowledge, language generation, speech synthesis (text-to-speech), and so forth. VVAs can currently be implemented in various devices such as cell phones, cars, computers, TVs, watches, etc.
They are composed of three main components (a minimal illustrative sketch follows the list below):
1. A physical component – the hardware element in which the assistant is embedded (smartphone, speaker, smart TV, etc.), containing microphones, speakers, networking, and computing (developed case-by-case).
2. A software component – the part implementing the human-machine interaction proper, integrating automatic speech recognition, natural language processing, dialogue, and speech synthesis modules. This may run directly within the physical component, although it is mostly done remotely.
3. Resources – external data such as content databases, ontologies, or business applications providing knowledge (e.g. “tell me what time it is in the UK”, “read my emails”, etc.) or allowing for the requested action to be concretely carried out (e.g. “increase the temperature by 2°C”).
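Purely as an illustration of how those three components fit together, the minimal Python sketch below mirrors that split. All class and function names (SpeechToText, DialogueManager, TextToSpeech, VirtualVoiceAssistant) are hypothetical placeholders invented for this example and do not correspond to any particular provider's implementation.

```python
from dataclasses import dataclass


# Hypothetical stand-ins for the software component's modules described above:
# automatic speech recognition, dialogue/natural language processing, and
# speech synthesis. Real implementations would call vendor-specific services.
class SpeechToText:
    def transcribe(self, audio: bytes) -> str:
        return "what time is it in the UK"  # placeholder transcription


class DialogueManager:
    def answer(self, query: str, knowledge: dict) -> str:
        # "Resources": external data or business applications consulted to
        # fulfil the request (time zones, email, smart-home actions, ...).
        return knowledge.get(query, "Sorry, I cannot help with that.")


class TextToSpeech:
    def synthesize(self, text: str) -> bytes:
        return text.encode("utf-8")  # placeholder audio


@dataclass
class VirtualVoiceAssistant:
    """Ties the physical component's audio I/O to the software component and
    its external resources, mirroring the three-part split described above."""
    asr: SpeechToText
    dialogue: DialogueManager
    tts: TextToSpeech
    resources: dict

    def handle(self, audio_from_microphone: bytes) -> bytes:
        query = self.asr.transcribe(audio_from_microphone)
        reply = self.dialogue.answer(query, self.resources)
        return self.tts.synthesize(reply)  # sent back to the device's speaker


if __name__ == "__main__":
    vva = VirtualVoiceAssistant(
        asr=SpeechToText(),
        dialogue=DialogueManager(),
        tts=TextToSpeech(),
        resources={"what time is it in the UK": "It is 3 pm in the UK."},
    )
    print(vva.handle(b"raw microphone audio"))
```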
A VVA may involve different actors in the course of its execution, as described below:
1. The VVA provider (or designer), responsible for its design, functionality, activation, architecture, data access, record management, hardware specifications, etc.
2. The application developer, creating applications to develop the VVA's functionalities, respecting the limitations imposed by the provider.
3. The integrator, who manufactures the equipment connected to a VVA.
4. The owner, responsible for the physical spaces receiving people (lodging places, professional environments, car rental, etc.) and who wishes to provide a VVA to their audience (possibly with dedicated applications).
5. The user, who uses the VVA on multiple devices: speaker, TV, smartphone, watch, etc.
According to the guidelines, the applicability of Article 5, item 3, from the E-Privacy Directive was consolidated for each of the above-described actors wishing to store or access information stored on the terminal equipment of an EU subscriber or user. The provision is as follows:
“3. Member States shall ensure that the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user is only allowed on condition that the subscriber or user concerned has given his or her consent, having been provided with clear and comprehensive information, in accordance with Directive 95/46/EC, inter alia, about the purposes of the processing. This shall not prevent any technical storage or access for the sole purpose of carrying out the transmission of a communication over an electronic communications network, or as strictly necessary in order for the provider of an information society service explicitly requested by the subscriber or user to provide the service.”
In addition, the guidelines are clear that data processing must be supported by one of the six legal bases provided for in Article 6 from the GDPR, i.e., (i) consent by the data subject, (ii) performance of a contract to which the data subject is party, (iii) compliance with a legal obligation, (iv) protection of the vital interests of the data subject or of another natural person, (v) performance of a task carried out in the public interest or in the exercise of official authority vested in the controller, and (vi) legitimate interests.
Thus, especially due to Article 5, item 3, from the E-Privacy Directive, the controller shall inform the data subject of all purposes of the processing, which leads to the reasoning that, out of the six above-described legal bases, consent by the data subject must be the legal basis generally chosen, as Article 6 from the GDPR cannot be invoked by controllers to reduce the additional protection provided for in Article 5, item 3, of the E-Privacy Directive. However, two exceptions are provided: (i) carrying out or facilitating the transmission of a communication over an electronic communications network and (ii) providing an information society service explicitly requested by the subscriber or user.
Incidentally, accidental activations by the user, such as the VVA misinterpreting speech or a command, cannot be considered valid consent. In this case, the deletion of accidentally collected personal data, if any, is recommended.
The EDPB's interpretation is also noteworthy. It considers voice data as biometric personal data, which, for the purposes of the Brazilian LGPD, would be considered sensitive personal data.
Some recommendations are mentioned in the guidelines:
1. When users are informed about the VVA's processing of personal data through a user account's privacy policy and the account is linked to other independent services (e.g. email or online purchases), the EDPB recommends that the privacy policy have a clearly separated section regarding the VVA's processing of personal data.
2. The information provided to the user should match the exact collection and processing that is carried out. While some meta-information is contained in a voice sample (e.g. the stress level of the speaker), it is not automatically clear whether such analysis is performed. It is crucial that controllers are transparent about which specific aspects of the raw data they process.
3. It should at all times be apparent which state the VVA is in, i.e., users should be able to determine whether a VVA is currently listening on its closed-loop circuit and especially whether it is streaming information to its back-end. This information should also be accessible to people with disabilities such as color blindness, deafness, etc.
4. Particular consideration should be applied if devices allow adding third-party functionality (“apps” for VVAs), as the user might not be sufficiently informed as to how and by whom their data is processed.
5. Users should be informed of the purpose of the processing of personal data, and that purpose should accord with their expectations of the device they purchase. In the case of a VVA, that purpose, from a user’s point of view, clearly is the processing of their voice for the sole purpose of interpreting their query and providing meaningful responses (be that answers to a query or other reactions, like remote-controlling a light switch).
6. When the processing of personal data is based on consent, such consent “should be given in relation to one or more specific purposes and the data subject should have a choice in relation to each of them”. Moreover, “a controller that seeks consent for various different purposes should provide a separate opt-in for each purpose, to allow users to give specific consent for specific purposes” (a simplified sketch of such per-purpose opt-ins follows this list).
7. From a user's perspective, the main purpose of processing their data is querying and receiving responses and/or triggering actions like playing music or turning on or off lights. After a query has been answered or a command executed, the personal data should be deleted unless the VVA designer or developer has a valid legal basis to retain them for a specific purpose.
8. Before considering anonymization as a means of fulfilling the storage limitation principle, VVA providers and developers should check whether the anonymization process renders the voice unidentifiable.
9. Configuration defaults should reflect these requirements by defaulting to an absolute minimum of stored user information. If these options are presented as part of a setup wizard, the default setting should reflect this, and all options should be presented as equal possibilities without visual discrimination.
10. When, during the review process, the VVA provider or developer detects a recording originating from a mistaken activation, the recording and all the associated data should be immediately deleted and not used for any purpose.
11. VVA designers and application developers should provide secure state-of-the-art authentication procedures to users.
12. Human reviewers should always receive the strictly necessary pseudonymised data. The legal agreements governing the review should expressly forbid any processing that could lead to the identification of the data subject.
13. If emergency calling is provided as a service through the VVA, a stable uptime should be guaranteed.
14. Voice models should be generated, stored and matched exclusively on the local device, not in remote servers.
15. Due to the sensitivity of voiceprints, standards such as ISO/IEC 24745 and techniques of biometric model protection should be thoroughly applied.
16. VVA designers should consider technologies that suppress background noise, to avoid recording and processing background voices and situational information.
17. If voice messages are to be used to inform users, the data controllers should publish such messages on their website so they are accessible to the users and the data protection authorities.
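Purely to illustrate recommendation 6 above (and the default-to-minimum configuration in recommendation 9), the sketch below shows one way a hypothetical VVA setup flow could record a separate opt-in per purpose. The purpose names and the ConsentRecord structure are assumptions made for this example; they are not taken from the EDPB guidelines.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


# Hypothetical purposes a VVA controller might seek consent for; each must be
# offered as its own opt-in rather than bundled into a single "I agree".
PURPOSES = (
    "execute_voice_requests",
    "improve_speech_recognition",
    "personalised_recommendations",
)


@dataclass
class ConsentRecord:
    """Stores a specific, separately given choice for each declared purpose."""
    user_id: str
    choices: dict = field(default_factory=dict)   # purpose -> bool
    recorded_at: str = ""

    def opt_in(self, purpose: str, granted: bool) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"Undeclared purpose: {purpose}")
        self.choices[purpose] = granted
        self.recorded_at = datetime.now(timezone.utc).isoformat()

    def may_process(self, purpose: str) -> bool:
        # Default to "no processing" when no explicit choice was made,
        # in line with the privacy-by-default configuration in item 9.
        return self.choices.get(purpose, False)


if __name__ == "__main__":
    consent = ConsentRecord(user_id="example-user")
    consent.opt_in("execute_voice_requests", True)
    consent.opt_in("improve_speech_recognition", False)
    print(consent.may_process("execute_voice_requests"))        # True
    print(consent.may_process("personalised_recommendations"))  # False
```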
Children can also interact with VVAs or create their own profiles connected to those of adults. Some VVAs are built into devices specifically aimed at children. When the legal basis for processing is consent, in accordance with Article 8, item 1, from the GDPR, processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if consent is given or authorized by the holder of parental responsibility over the child. However, if the legal basis for processing is the performance of a contract, the conditions for processing the personal data of a child will depend on the Member State's contractual laws.
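As a simple illustration of that age rule, and only of the rule as summarised above (not of any Member State variation), a hypothetical consent check might look like the following sketch; it is not legal advice or a complete compliance check.

```python
# Illustrative check of the Article 8(1) GDPR condition described above.
# The 16-year threshold follows the rule as summarised in the text.
AGE_OF_DIGITAL_CONSENT = 16


def child_consent_is_lawful(age: int, parental_authorization: bool) -> bool:
    """Return True when consent-based processing of a child's personal data
    is lawful under the rule summarised above."""
    if age >= AGE_OF_DIGITAL_CONSENT:
        return True  # the child may consent on their own behalf
    return parental_authorization  # otherwise parental authorization is needed


if __name__ == "__main__":
    print(child_consent_is_lawful(age=17, parental_authorization=False))  # True
    print(child_consent_is_lawful(age=12, parental_authorization=True))   # True
    print(child_consent_is_lawful(age=12, parental_authorization=False))  # False
```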