Digital voice assistants such as Amazon's Alexa, Google Assistant, and Apple's Siri have for years been sampling fragments of users' recordings to improve the quality of their speech recognition. Apple now wants to suspend this practice worldwide. Just recently, Google announced that by the end of October 2019 its employees across the EU would no longer be evaluating voice recordings from the Google Assistant.
A few days ago, a report by the Guardian revealed that Siri recordings reviewed by Apple staff sometimes contained highly private material: confidential conversations, business discussions, criminal activity, and even sexual encounters.
Apple is the first provider of digital assistants to explicitly ask users for permission before employees may listen to their recordings. The feature is to be delivered in a later software update, the HomePod maker told TechCrunch. Until then, the practice will be suspended worldwide and put under review.
Many users were shocked to learn that some of their voice-assistant recordings were being transcribed by humans to improve speech recognition.
The issue also covers cases where the assistants activated by mistake, even though the wake word was never spoken. This happens again and again with smart speakers running Alexa, Google Assistant, and Siri. Since the devices are not supposed to be listening at that moment, such false activations can capture sensitive data – everything being said at the time.
The providers emphasize that the recordings are anonymized. Still, users were largely unaware of the practice until the first media reports appeared a few months ago. Amazon, Apple, and Google listened to recordings of voice commands and analyzed them to improve their digital assistants, according to the providers. By listening to the clips afterwards, employees are supposed to determine which words or sounds triggered an accidental activation, so that the software can be adjusted accordingly.