
Bloomberg report offers inside look at Amazon’s global team listening to Alexa audio clips

A new report from Bloomberg offers a look at the team of people that Amazon employs to listen to and monitor Alexa voice recordings. The report explains that Amazon employs “thousands of people around the world” to listen to voice recordings captured through its Echo devices.

Some of the employees work full-time for Amazon, while others are independent contractors. In either case, Amazon requires them to sign nondisclosure agreements to keep the details of the program out of the public eye. The workers are located all over the world, including in Boston, Costa Rica, India, and Romania.

The goal of this tactic by Amazon is to “eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands,” the report explains. Inherently, however, there are privacy concerns.

Each reviewer works nine-hour shifts, processing as many as 1,000 audio clips per shift. In some cases, the recordings are rather boring and consist of simply mining “accumulated voice data for specific utterances such as ’Taylor Swift’.” If employees need “help parsing a muddled word,” they might share the audio files in a chatroom with other employees.

Amazon’s review process for speech data begins when Alexa pulls a random, small sampling of customer voice recordings and sends the audio files to the far-flung employees and contractors, according to a person familiar with the program’s design.

In other cases, however, things are a lot more interesting. For instance, the report describes what happens when a reviewer hears something that might be considered upsetting, or even criminal. Even then, Amazon doesn’t see it as its job to interfere:

Sometimes they hear recordings they find upsetting, or possibly criminal. Two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress.

Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.

In a statement, Amazon explained that it takes security and privacy seriously and only annotates an “extremely small sample” of Alexa voice recordings. The company explains that these random samples help it train speech and language understanding systems, which in turn improves Alexa’s ability to understand requests.

“We take the security and privacy of our customers’ personal information seriously,” an Amazon spokesman said in an emailed statement. “We only annotate an extremely small sample of Alexa voice recordings in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.”

“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”

Bloomberg’s report also makes mention of Apple’s efforts in regards to Siri. The report explains that while Apple has “human helpers,” the recordings lack personally identifiable information. After six months, data is stripped of its random identification information, but could be stored for longer periods to improve voice recognition.

Apple’s Siri also has human helpers, who work to gauge whether the digital assistant’s interpretation of requests lines up with what the person said. The recordings they review lack personally identifiable information and are stored for six months tied to a random identifier, according to an Apple security white paper. After that, the data is stripped of its random identification information but may be stored for longer periods to improve Siri’s voice recognition.

The full report from Bloomberg is absolutely worth a read and can be found here.



Author

Chance Miller

Chance is an editor for the entire 9to5 network and covers the latest Apple news for 9to5Mac.

Tips, questions, typos to chance@9to5mac.com