$273 million. That's what Amazon, Apple, Google, and Zoom have paid in legal settlements for privacy violations related to voice data and cloud AI - as of 2026.

Each settlement was described by the company as resolving the matter without admitting wrongdoing. Each company changed its privacy practices after the incident came to light. Each company had privacy policies that technically disclosed the relevant practices in the first place. None of that stopped the violations. None of it restored privacy to the users whose conversations were reviewed, shared, or used for training without their meaningful understanding of what they were agreeing to.

The $273 million is not the interesting number. The interesting number is zero - the amount you spend on settlements when voice processing happens on your device and never reaches a server.

Amazon Alexa: $25 million

Amazon employed thousands of workers in the US, Costa Rica, and Romania to listen to and transcribe Alexa recordings. Contractors reported hearing private medical conversations, domestic disputes, and at least one incident consistent with a sexual assault. Contractors could access customers' home addresses through account numbers linked to the recordings. Amazon confirmed the program and defended it as necessary for improving Alexa.

The FTC settlement in 2023 focused primarily on children's data - Amazon's failure to delete voice recordings when parents requested deletion. The $25 million resolved those specific violations without covering the full scope of the contractor listening program.

Full account: Amazon Alexa's Listening Program.

Apple Siri: $95 million

Apple contractors heard confidential Siri recordings including medical discussions, business negotiations, drug deals, and intimate conversations. Many recordings were triggered without intentional activation - devices capturing conversations users had no idea were being recorded. Apple suspended the program in 2019 and made human review opt-in with iOS 13.2.

The class-action settlement in 2025 was $95 million. It covered US residents who owned a Siri-enabled device between 2014 and 2024. Eligible claimants can receive up to $20 per device for a maximum of five devices.

Full account: Apple's $95M Siri Settlement.

Google Assistant: $68 million

A whistleblower leaked over 1,000 Google Assistant recordings to Belgian broadcaster VRT NWS, including 153 recordings captured without wake-word activation - triggered by background noise or ambient sound. Contractors could identify users' addresses despite nominal anonymization. Google confirmed the practice, calling it a "language expert review."

The class-action settlement in 2026 was $68 million. It is the most recent of the three, closing a legal chapter that began with a Flemish TV station and a single person who decided the public should know what they were hearing on the job.

Full account: Google's $68M Settlement.

Zoom: $85 million

Zoom paid $85 million in 2021 to settle a class-action lawsuit for sharing user video call data with Facebook, Google, and LinkedIn without user permission - and for falsely claiming to provide end-to-end encryption. In 2023, Zoom separately updated its terms to claim a perpetual license to use meeting content for AI training, reversed after public backlash while retaining a loophole for service-generated data.

Full account: Zoom's AI Training Controversy.

The pattern

Every settlement in this list follows the same structure: a cloud service sends user audio or data to remote infrastructure, the data is processed or reviewed in ways users didn't meaningfully understand, the practice is revealed by a whistleblower or investigation, the company confirms and defends it, changes are made, legal action follows. The changes address the disclosed practice. The architecture that made the practice possible remains.

Amazon made changes. Alexa still sends audio to Amazon's servers. Apple made changes. Siri still processes voice in Apple's cloud. Google made changes. Assistant still routes through Google's infrastructure. Zoom revised its terms. Meeting audio still leaves your device.

The settlements are what the legal system extracted for these specific violations. They do not represent the full value that was taken from the people whose private conversations were reviewed without their meaningful consent. They do not restore the privacy that was lost. They quantify, imperfectly, the cost of a trust model that was not warranted.

Samsung: $0 settlement, unlimited cost

Samsung engineers pasted proprietary semiconductor source code and internal meeting transcripts into ChatGPT across three incidents in 20 days in 2023. No settlement has been paid, because the damage ran in the other direction: the data went from Samsung to OpenAI, handed over by Samsung's own employees. The cost is the IP that left the building. Samsung banned all generative AI tools company-wide and began building an internal LLM.

The Samsung case appears in this accounting because it illustrates the other side of the cloud AI privacy equation. The settlements above document what happens when cloud AI companies misuse user data. Samsung documents what happens when users hand sensitive data to cloud AI companies who use it exactly as their terms permit.

Full account: The Samsung ChatGPT Leak.

Otter.ai: lawsuit pending

A 2025 class-action lawsuit alleges that Otter.ai recorded meeting participants without proper multi-party consent - including a journalist's interview with a Uyghur activist whose safety depends on that conversation staying private. The case is ongoing.

Full account: AI Meeting Recorders and the Consent Problem.

The architecture that produces zero settlements

There are no settlements from companies whose voice AI processes audio on the user's device, because there are no servers receiving the audio, no contractors reviewing it, no third-party data sharing, and no training pipeline fed by user recordings.

This is not because local voice AI companies have better privacy policies. It's because they have a different architecture. The chain that produces these violations - audio leaves device, audio arrives at server, server is operated by people with access to the audio - doesn't exist when inference runs locally.

ToolPiper's voice features run on your Mac's Neural Engine. Parakeet v3 transcribes speech in local memory. The audio never reaches a server. No amount of policy revision, opt-in consent, or settlement can change the privacy of a recording that was never made outside your own device.

The $273 million is the industry's tab for asking users to trust an architecture that, at scale and over time, could not be trusted. The alternative doesn't require trust. It requires a chip in a Mac that has existed since 2020.

Download ToolPiper at modelpiper.com. Run voice dictation. Open Activity Monitor. Watch nothing happen on the network.
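The Activity Monitor check can also be run from the command line. This is a minimal sketch using `lsof`, which ships with macOS, to list any open internet sockets for a given process; the process name `ToolPiper` in the comment is an assumption - substitute whatever name the app shows in Activity Monitor.

```shell
# net_sockets_of: print a process's open internet sockets (empty if none).
# -i: internet sockets only; -a: AND with -p <pid>; -nP: skip DNS and
# port-name lookups so the check itself generates no network traffic.
net_sockets_of() {
    lsof -nP -i -a -p "$1" 2>/dev/null | tail -n +2
}

# Demonstration on this shell's own pid, which holds no sockets.
# For the app, something like (process name is an assumption):
#   net_sockets_of "$(pgrep -xn ToolPiper)"
if [ -z "$(net_sockets_of "$$")" ]; then
    echo "pid $$: no open network sockets"
else
    echo "pid $$: open network sockets found"
    net_sockets_of "$$"
fi
```

An empty result means the process held no open TCP or UDP sockets at the moment of the check; running it while dictation is actively transcribing is the interesting test.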

This is the roundup article for the Voice AI Privacy series. The full series: Why Voice AI Should Never Leave Your Device (pillar) - Amazon Alexa - Apple Siri - Google Assistant - Samsung / ChatGPT - Zoom - Meeting Recorders - Wispr Flow - Is ToolPiper Safe?