Apple agrees to pay $153 million to settle proposed Siri eavesdropping lawsuit

The settlement would allow Apple to avoid going to court over a proposed class action lawsuit alleging that its virtual assistant, Siri, violated user privacy by inadvertently recording conversations. The company has denied any wrongdoing.


Siri, Apple's digital voice assistant, was introduced more than a decade ago. Source: Getty / NurPhoto

Apple has agreed to pay US$95 million ($153 million) to settle a proposed class action lawsuit claiming that its voice-activated Siri assistant violated user privacy.

The proposed settlement, filed in federal court in Oakland, California, would resolve a five-year-old lawsuit alleging that Apple surreptitiously activated Siri to record conversations through iPhones and other devices equipped with the virtual assistant for more than a decade.

The alleged recordings occurred even when people didn't seek to activate the virtual assistant with the trigger words, "Hey, Siri".

The lawsuit asserted that some of the recorded conversations were then shared with advertisers seeking to pitch their products to consumers who were more likely to be interested in those goods and services.

The allegations about a snoopy Siri appeared to contradict Apple's long-standing commitment to protecting its customers' privacy.

CEO Tim Cook has often framed that commitment as a fight to preserve "a fundamental human right".

Apple claims no wrongdoing

Apple isn't acknowledging any wrongdoing in the settlement, which still must be approved by US district judge Jeffrey White. Lawyers in the case have proposed scheduling a 14 February court hearing in Oakland to review the terms.

If the settlement is approved, tens of millions of US-based consumers who owned iPhones and other Apple devices from 17 September 2014 through to the end of 2024 could file claims.

Each consumer could receive up to US$20 ($32) per Siri-equipped device covered by the settlement, although the payment could be reduced or increased depending on the volume of claims.

Only 3 to 5 per cent of eligible consumers are expected to file claims, according to estimates in court documents.

Eligible consumers will be limited to seeking compensation on a maximum of five devices.

The settlement represents a sliver of the US$705 billion ($1.134 trillion) in profits that Apple has pocketed since September 2014.

How to protect your privacy

Whether built into standalone smart speakers, computers, tablets, or phones, voice assistants rely on listening for specific sound patterns, such as a "wake word" like "Alexa" or "OK, Google," to activate and record.

However, they can sometimes misinterpret sounds and begin recording unexpectedly.

These recordings are often transmitted to the manufacturer's servers, so you should take measures to protect both your interactions with the voice assistant and your private conversations with others.

Here are some practical steps you can take to protect your privacy:

Disable microphone access

Go to your phone's settings and review which apps have access to your microphone, revoking access for any that don't need it.

If you're worried about how often Siri is listening to and recording you without your consent, turn off Apple's virtual helper by following these steps:

Navigate to Settings > Siri & Search.

Toggle off both Listen for 'Hey Siri' and Press Side Button for Siri.

Tap Turn Off Siri when a pop-up window appears.

See if recordings are being stored

Regardless of which voice assistant you use, you should be able to check whether your recordings are being stored.

You should also be able to choose how long recordings are kept, or set them to be deleted automatically.
There are some practical steps you can take to protect your privacy when using smart assistants. Source: Getty / alvarez

Check the privacy policy

Check the privacy policy for your voice assistant to understand how your audio recordings are handled and who can listen to them.

You may be able to change your settings to opt out of human review of recordings.

Published 4 January 2025 12:56pm
Source: Reuters, AAP, SBS

