---
title: "Apple's $95 Million Siri Settlement: What Happened and What It Means"
description: "Apple settled a class-action lawsuit for $95 million in 2025 over Siri privacy violations. Contractors were listening to recordings including medical conversations, business negotiations, and intimate moments. Here's the full account."
date: 2026-04-14
author: "Ben Racicot"
tags: ["Privacy", "Voice", "macOS", "Speech to Text", "Security"]
type: "article"
canonical: "https://modelpiper.com/blog/apple-siri-privacy-settlement/"
---

# Apple's $95 Million Siri Settlement: What Happened and What It Means

> Apple settled a class-action lawsuit for $95 million in 2025 over Siri privacy violations. Contractors were listening to recordings including medical conversations, business negotiations, and intimate moments. Here's the full account.

## TL;DR

In 2019, a whistleblower revealed that Apple employed contractors to review Siri recordings for quality assurance. Contractors regularly heard confidential medical information, business deals, drug negotiations, and intimate conversations - often from recordings triggered without intentional activation. Apple suspended the program, made it opt-in, and in 2025 settled a class-action lawsuit for $95 million. US users who owned a Siri device between 2014 and 2024 can claim up to $20 per device.

Apple's privacy reputation is carefully built. "What happens on your iPhone stays on your iPhone." Billboards. Ad campaigns. Years of positioning. In 2019, a whistleblower described what was actually happening with Siri recordings. In 2025, Apple settled a class-action lawsuit over it for $95 million.

This is not ancient history. The settlement covered Siri device owners from 2014 to 2024. If you owned an iPhone, iPad, Apple Watch, HomePod, or Mac with Siri during that decade, you fall within the class the lawsuit covered.

## What did Apple do with Siri recordings?

Apple employed contractors to review a percentage of Siri voice recordings for quality assurance purposes. Contractors regularly heard confidential conversations including medical discussions, business negotiations, and intimate moments - many of which were recorded without intentional activation of Siri.

The practice was reported in July 2019 by [MIT Technology Review](https://www.technologyreview.com/2019/07/29/134008/apple-contractors-hear-confidential-details-from-siri-recordings/), based on accounts from a whistleblower, later publicly identified as Thomas le Bonniec. Contractors described reviewing recordings that included:

- Conversations between doctors and patients discussing diagnoses and treatment plans
- Business negotiations
- Drug deals
- Couples having private conversations

All captured by Siri activating without the user's awareness - triggered by background noise, TV audio, or ambient sound that resembled the wake phrase.

The recordings were technically anonymized, but contractors reported they could often identify the person and location from context. A contractor who heard someone discuss their address, describe their home, and mention their doctor's name is not working with truly anonymous data.
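The re-identification risk contractors described can be illustrated with a toy sketch. The transcript, the regex patterns, and every name below are invented for demonstration; the point is that a handful of quasi-identifiers combine into a near-unique fingerprint even after the speaker's name is stripped:

```python
import re

# Invented transcript for demonstration. No single detail names the speaker,
# but the combination of details narrows the candidate pool dramatically.
transcript = (
    "I live at 42 Elm Street, the blue house with the wraparound porch. "
    "Dr. Alvarez said the biopsy results come back Tuesday."
)

# Simple patterns for three kinds of quasi-identifier.
patterns = {
    "address": r"\d+\s+\w+\s+Street",
    "doctor": r"Dr\.\s+\w+",
    "home_description": r"(?:blue|red|white)\s+house",
}

# Extract whichever quasi-identifiers appear in the transcript.
quasi_identifiers = {
    label: match.group(0)
    for label, rx in patterns.items()
    if (match := re.search(rx, transcript))
}

# Each field alone matches many people; together they plausibly match one.
print(quasi_identifiers)
# {'address': '42 Elm Street', 'doctor': 'Dr. Alvarez', 'home_description': 'blue house'}
```

This is the standard quasi-identifier problem from the anonymization literature: removing the name field does not remove identity when the content itself carries it.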

## What did Apple's privacy policy say?

Apple's privacy documentation at the time acknowledged that Siri data was reviewed to improve the service. The disclosure existed. What it lacked was any practical prominence - the kind of placement that would lead an ordinary user to understand that a human contractor might hear their conversation with their doctor.

This is the consistent pattern across every cloud voice privacy incident of this period. Amazon had it. Google had it. Microsoft had it. The disclosure was real. The informed consent was not. A privacy policy disclosure that requires a legal education to find and interpret is not the same as a user understanding what they're agreeing to.

## How did Apple respond?

Apple's initial response in August 2019 was to suspend the Siri grading program entirely. In October 2019, with the release of iOS 13.2, Apple made significant changes: users could opt out of Siri recording review entirely, could delete their Siri audio history, and Apple stopped retaining audio recordings by default - switching to computer-generated transcripts only. Human review going forward was limited to Apple employees rather than contractors.

These were substantive changes. Apple's response was faster and more complete than most companies in the same situation. The opt-in model and audio deletion capability were genuinely useful controls.

They didn't prevent the lawsuit. The five years of recording review before those changes - 2014 to 2019 - were the basis for the class action.

## The $95 million settlement

Apple settled the class-action lawsuit in 2025 for $95 million without admitting wrongdoing. The settlement covers US residents who owned a Siri-enabled device between September 17, 2014 and December 31, 2024. Eligible claimants can receive up to $20 per device, for a maximum of five devices - up to $100 per person.

$95 million is a large number. It is also roughly two hours of Apple's revenue. The financial cost to Apple is not the meaningful part of this story. The meaningful part is what it confirms: a decade of users speaking to Siri with an expectation of privacy that was not matched by the actual architecture of the product.

## What this reveals about cloud voice architecture

Apple is not a bad actor in this story. The Siri grading program was a standard industry practice - Amazon, Google, and Microsoft all did the same thing in the same period. Apple's post-incident response was more thorough than most. The $95 million settlement represents accountability for past practice, not ongoing misconduct.

What the incident reveals is structural. Cloud voice AI requires sending audio to remote servers for processing. Remote servers are operated by people. Quality assurance on a voice AI product requires human review of some percentage of recordings to catch errors the automated system misses. These are not independent facts - they are a chain. The architecture produces the outcome.

No privacy policy change, no opt-in mechanism, no compliance certification changes the chain. They govern what happens at specific points in it. The audio still travels. The server still receives it. The people who operate the server still have access to it.

Local voice inference breaks the chain at the first link. When Parakeet v3 runs on your Mac's Neural Engine, the audio goes from your microphone to a model in local memory. There are no contractors. There is no server. There is no chain to govern because the chain doesn't exist.
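The chain, and where local inference cuts it, can be sketched as a toy model. The party names below are illustrative, not a description of Apple's actual infrastructure:

```python
# Toy model of the access chain: who can observe the raw audio under each
# architecture. Party names are illustrative placeholders.
def parties_with_audio_access(architecture: str) -> list[str]:
    """Return every party that can observe the raw audio."""
    if architecture == "cloud":
        # Audio leaves the device, so every downstream link can observe it.
        # Policy controls govern individual links; none removes a link.
        return ["device", "network transit", "server operator", "human reviewer"]
    if architecture == "local":
        # Audio goes from microphone to a model in local memory and stops.
        return ["device"]
    raise ValueError(f"unknown architecture: {architecture}")

print(parties_with_audio_access("cloud"))
# ['device', 'network transit', 'server operator', 'human reviewer']
print(parties_with_audio_access("local"))
# ['device']
```

The design point is that the cloud list can only be governed, never shortened: opt-in mechanisms change who is allowed to look, not who is able to.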

Apple knows this better than anyone. Their own privacy marketing has emphasized on-device processing for years. Siri itself has moved more processing to on-device with each hardware generation. The direction is clear. The gap between where the industry is and where it should have been is what the settlement quantifies.

Download ToolPiper at [modelpiper.com](https://modelpiper.com) for voice AI that runs entirely on your Mac's Neural Engine.

_Part of the [Voice AI Privacy series](/blog/why-voice-ai-should-stay-local). Related: [Wispr Flow's Privacy Incident](/blog/wispr-flow-privacy-incident). [Is ToolPiper Safe?](/blog/is-toolpiper-safe) - how to verify local inference yourself._

## FAQ

### What is the Apple Siri privacy settlement?

Apple settled a class-action lawsuit in 2025 for $95 million over Siri privacy violations. The lawsuit alleged that Apple contractors listened to Siri recordings - including confidential medical conversations, business negotiations, and intimate moments - without adequate user disclosure. The settlement covers US residents who owned a Siri-enabled device between September 17, 2014 and December 31, 2024.

### Can I claim money from the Apple Siri settlement?

If you are a US resident who owned a Siri-enabled Apple device between September 17, 2014 and December 31, 2024, you may be eligible. Eligible claimants can receive up to $20 per device, for a maximum of five devices ($100 per person). Check the settlement administrator's website for claim instructions and deadlines.

### Did Apple contractors listen to Siri conversations?

Yes. Apple confirmed in 2019 that contractors reviewed a percentage of Siri recordings for quality assurance purposes. Contractors reported hearing confidential medical discussions, business negotiations, drug deals, and intimate conversations - many from recordings triggered without intentional Siri activation. Apple suspended the program in 2019 and made human review opt-in with the release of iOS 13.2.

### Is Siri private now?

Apple made significant improvements after the 2019 incident: opt-in consent for human review, audio deletion capability, and computer-generated transcripts as the default instead of retained audio recordings. However, Siri on most devices still requires sending voice data to Apple's servers for processing. On-device Siri processing has expanded but is not universal. For voice AI that never sends audio over the network, local inference tools like ToolPiper process everything on your device.
