The report said Alexa's auditors do not have access to customers' full names or addresses, but they do have serial numbers and account numbers associated with the device.
The company employs "thousands of people around the world" to listen to people talk to Alexa via Echo speakers, Bloomberg reports. According to the ex-Amazon workers, reviewers use internal chat channels to help each other with particularly tricky transcriptions - and yes, amusing clips do get shared among employees worldwide in those rooms. "We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience", an Amazon spokesperson said. Amazon also says that employees don't have direct access to identifiable information about users.
"All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it," the spokesperson added.
Amazon has previously been embroiled in controversy over privacy concerns regarding Alexa. But Amazon gives Echo device owners the option of disabling the use of their voice recordings, according to the report.
Amazon employees are listening to what people tell Alexa - the company's voice-recognition-based AI software - in the intimacy of their homes, a Bloomberg report revealed Thursday. And while most clips are mundane or merely amusing, occasional recordings are sexually explicit, and some appear to capture outright criminal activity.
Amazon's defense is that it does mention this practice in its frequently asked questions section, which states that users agree to help train Alexa's speech recognition and language understanding. As for the disturbing recordings, reviewers said they were told by colleagues that it was not Amazon's job to intervene.
And yes, that does mean people are possibly giggling at the stupid things you've said to Alexa in the past.
While smart speakers are technically always "hearing", they are typically not "listening" to your conversations. When reviewers do come across a recording they believe captures something distressing or criminal, they tick a box that marks it as "critical data" and then continue to the next clip. In the Alexa Privacy Settings there is a "Manage How Your Data Improves Alexa" section where you can decide whether your voice recordings are used to "help develop new features" and/or "help improve transcription accuracy".
Of course, Apple and Google also rely on humans to improve Siri and Google Assistant. Remember that Alexa only retains recordings after you've uttered the wake word - although accidental activations also get captured.