
Privacy: Google stops evaluation of voice recordings in the EU

Roughly one in every 500 Google Assistant recordings is listened to and transcribed by Google employees. As a whistleblower recently revealed, these include not only very private and intimate recordings but also some that were captured unintentionally. Against this background, the Hamburg Data Protection Commissioner, Johannes Caspar, opened an administrative procedure to prohibit Google from carrying out such evaluations for three months. In the course of the proceedings, Google stated that it would not transcribe any voice commands across the EU for at least three months.

For Google, the data protection authority in Ireland is actually responsible, as the company has its European headquarters there, and that is also where the final decision must be made. Under the General Data Protection Regulation, however, data protection authorities in other member states may adopt measures within their own territory for a period of no more than three months if there is an urgent need to protect those affected. "This is the case here, because effective protection of those affected from the eavesdropping on, documenting and evaluation of private conversations by third parties can only be achieved through prompt enforcement," the data protection commissioner said in justifying his intervention. He called on the competent authorities to implement appropriate measures for other voice assistant systems as well.

Voice recordings made by Alexa and Siri are also listened to and evaluated by humans. Time and again, the voice assistants are activated unintentionally and send recordings to Google, Apple or Amazon – sometimes with very private content, including conversations about love life, health or intimate acts.

"The use of language assistance systems in the EU must follow DSGVO's privacy policy, and there is considerable doubt in the case of the Google Assistant," says Caspar. Affected parties would need to be informed transparently about the processing of voice commands, but also about the frequency and risks of misactivations. 'Finally, the need to protect third parties affected by the voice recordings must be adequately taken into account', says the state data protection officer. Ultimately, the data protection authorities would have to decide on the final measures that would be necessary for a privacy-compliant operation of the language assistants.