---
title: "Amazon's Alexa Listening Program: What Bloomberg Found and What Amazon Paid"
description: "In 2019, Bloomberg revealed that Amazon employed thousands of workers to listen to Alexa recordings - including private medical conversations and what appeared to be a sexual assault. Amazon confirmed it. The FTC settlement was $25 million."
date: 2026-04-21
author: "Ben Racicot"
tags: ["Privacy", "Voice", "Local AI", "Security"]
type: "article"
canonical: "https://modelpiper.com/blog/amazon-alexa-listening-program/"
---

# Amazon's Alexa Listening Program: What Bloomberg Found and What Amazon Paid

> In 2019, Bloomberg revealed that Amazon employed thousands of workers to listen to Alexa recordings - including private medical conversations and what appeared to be a sexual assault. Amazon confirmed it. The FTC settlement was $25 million.

## TL;DR

In April 2019, Bloomberg reported that Amazon employed thousands of workers in the US, Costa Rica, and Romania to listen to and transcribe Alexa voice recordings - reviewing up to 1,000 clips per shift. Contractors heard private medical conversations, domestic disputes, and at least one incident that appeared to be a sexual assault. Amazon confirmed the program. In 2023, Amazon paid $25 million to settle FTC allegations. The Alexa case is the founding document of the cloud voice AI privacy problem.

The Bloomberg story ran on April 10, 2019, under the headline "Is Anyone Listening to You on Alexa?" The answer was yes. Thousands of people were.

Amazon employed workers across multiple countries - the United States, Costa Rica, Romania - to listen to and transcribe voice recordings captured by Echo devices. Each worker reviewed up to 1,000 audio clips per shift. The work was described internally as helping Alexa understand accents, interpret requests, and handle edge cases that automated systems flagged for human review.

[Bloomberg's investigation](https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio) included accounts from people who worked in the program. What they described went beyond linguistic edge cases.

## What did the Alexa contractors hear?

Amazon contractors reviewing Alexa recordings reported hearing private medical conversations, sensitive personal discussions, domestic disputes, children's voices, and at least one recording that appeared to depict a sexual assault. Amazon confirmed the contractor review program, calling it necessary for improving Alexa's accuracy.

Bloomberg's sources described specific incidents that stayed with the contractors. A woman singing in what sounded like a shower. Children screaming in a way that raised concern. And at least one recording where contractors believed they were hearing a sexual assault. The contractors reported the last incident to supervisors. The supervisors told them it wasn't their job to intervene - their job was to transcribe.

A follow-up Bloomberg investigation published April 24, 2019 revealed another detail: [Alexa reviewers could access customers' home addresses](https://www.bloomberg.com/news/articles/2019-04-24/amazon-s-alexa-reviewers-can-access-customers-home-addresses). The recordings were nominally anonymized - associated with account numbers and device serial numbers rather than names. But account numbers linked to addresses. A contractor reviewing a recording also had access to where the device was located.

## What did Amazon say?

Amazon confirmed the program and defended it. Their statement described using "an extremely small number of customer interactions" to improve Alexa's performance, claiming the practice was disclosed in their terms of service and Alexa FAQ. They stated that employees did not have "direct access" to identifying information - a claim that the later home-address reporting complicated.

Amazon also stated that annotators were "required to adhere to Amazon's confidentiality requirements" and that the company had "strict technical and operational safeguards" against misuse. No specific safeguards were described. No mechanism for contractors to report potentially criminal audio was disclosed.

The defense was essentially: we disclosed this, it's normal, and you should trust our safeguards. All three claims were contested by Bloomberg's reporting.

## The FTC settlement

In May 2023, Amazon agreed to pay $25 million to settle FTC and Department of Justice allegations related to Alexa privacy violations. The primary focus of the settlement was children's data - specifically Amazon's failure to delete voice recordings and associated data when parents requested deletion, and retaining children's voice data beyond what was necessary for the service.

The children's data violations were egregious enough to anchor the settlement, but they sat on top of the adult recording program Bloomberg had exposed four years earlier. The $25 million addressed only the children's-data portion of Amazon's voice data practices.

## Why the Alexa case matters most

Amazon was first. The April 2019 Bloomberg story preceded the Google and Apple revelations by three months. It established the framework that the subsequent investigations followed: cloud voice AI, human contractors, access to private conversations, disclosure buried in terms of service, company confirmation and defense.

Every cloud voice product that existed in 2019 operated on the same basic model: recordings go to servers, and improving accuracy requires humans to review a sample of what arrives. The question was never whether this was happening - it was whether users knew.

They didn't. And when they found out, the consistent response was not to change the architecture but to change the policy. Opt-in consent. Audio deletion tools. Restricted contractor access. Better disclosures. All of these are real improvements. None of them change the fundamental fact: the audio still leaves the device.

## The architecture that prevents this

When voice processing happens on your device, there are no contractors. Not because the company decided not to employ them. Because no audio ever reaches a server where human review could happen. The chain that produces the problem doesn't exist.

ToolPiper's voice dictation uses Parakeet v3 running on your Mac's Neural Engine. The audio goes from your microphone to a model in local memory. No network request. No Amazon server. No account number linking to your address. No contractor reviewing your recording. Not because of a policy. Because of an architecture.
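The difference is structural, not procedural. As a conceptual sketch (the class and method names below are illustrative, not ToolPiper's actual API), a local pipeline simply has no step at which audio could reach a reviewer, while a cloud pipeline's upload step is exactly where human review becomes possible:

```python
from dataclasses import dataclass, field

@dataclass
class LocalTranscriber:
    """Illustrative local pipeline: audio stays in process memory."""
    network_requests: int = 0  # never incremented - there is no network step
    transcripts: list = field(default_factory=list)

    def transcribe(self, audio: bytes) -> str:
        # A real on-device tool would run a local speech model here
        # (e.g. on the Neural Engine); a stub stands in for it.
        text = f"<{len(audio)} bytes transcribed locally>"
        self.transcripts.append(text)
        return text

@dataclass
class CloudTranscriber:
    """Illustrative cloud pipeline: every upload is a potential review point."""
    uploaded_clips: list = field(default_factory=list)

    def transcribe(self, audio: bytes) -> str:
        # The upload is the step that creates the contractor problem:
        # once audio reaches a server, humans can be put in the loop.
        self.uploaded_clips.append(audio)
        return "<transcribed on a server>"

local = LocalTranscriber()
local.transcribe(b"\x00" * 16000)
print(local.network_requests)  # 0 - there is no upload step to audit or police
```

The point of the sketch is that the local pipeline's privacy property is absent by construction, not enforced by policy: there is no counter to keep at zero because there is no request to count.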

The Alexa case is why "trust us" is not a sufficient privacy model for voice AI. It's why verifiable architecture matters more than privacy policies. And it's why the answer to "is this private?" should be something you can confirm yourself rather than something you accept on faith.

Download ToolPiper at [modelpiper.com](https://modelpiper.com). Run voice dictation. Watch the network tab in Activity Monitor. Nothing moves.
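Beyond Activity Monitor's Network tab, `lsof` can list any network sockets a process holds open. A minimal sketch (the process name `ToolPiper` is an assumption - substitute the binary's actual name):

```shell
# Look up the process by name; empty PID means it isn't running.
# The name "ToolPiper" is an assumption for illustration.
PID=$(pgrep -x ToolPiper | head -n 1 || true)
if [ -n "$PID" ]; then
  # -i: internet sockets, -a: AND the filters, -p: restrict to this PID.
  # If the process holds no network sockets, lsof prints nothing.
  OPEN_SOCKETS=$(lsof -i -a -p "$PID" 2>/dev/null || true)
  echo "${OPEN_SOCKETS:-no open network connections}"
else
  OPEN_SOCKETS=""
  echo "process not running"
fi
```

On macOS, `nettop -p <pid>` gives a live per-process count of bytes in and out, which makes the "nothing moves" claim directly observable while you dictate.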

_Part of the [Voice AI Privacy series](/blog/why-voice-ai-should-stay-local). Related: [Apple's $95M Settlement](/blog/apple-siri-privacy-settlement). [Google's $68M Settlement](/blog/google-assistant-privacy-settlement). [Is ToolPiper Safe?](/blog/is-toolpiper-safe)_

## FAQ

### Did Amazon employees listen to Alexa recordings?

Yes. Amazon confirmed in 2019 that it employed thousands of workers in multiple countries to listen to and transcribe Alexa voice recordings, reviewing up to 1,000 clips per shift. Contractors reported hearing private medical conversations, domestic disputes, children's voices, and at least one recording that appeared to depict a sexual assault. Amazon defended the practice as necessary for improving Alexa's accuracy.

### What was the Amazon Alexa FTC settlement?

Amazon agreed to pay $25 million in May 2023 to settle FTC and Department of Justice allegations related to Alexa privacy violations. The settlement focused primarily on Amazon's failure to delete children's voice recordings and associated data when parents requested deletion. The settlement came four years after Bloomberg's investigation exposed the broader contractor listening program.

### Could Amazon contractors access your home address?

A follow-up Bloomberg investigation found that Alexa reviewers could access customers' home addresses through account numbers linked to the anonymized recordings. Recordings were associated with account numbers and device serial numbers rather than names, but those numbers linked to delivery addresses on file with Amazon.

### Is Alexa private now?

Amazon added opt-in consent and audio deletion tools following the 2019 controversy. However, Alexa still processes voice commands through Amazon's cloud infrastructure. For voice AI that processes audio entirely on-device with no network request, local tools like ToolPiper run speech recognition on your Mac's Neural Engine - the audio never leaves your device, so contractor review is impossible.
