Apple reportedly agreed to pay $95 million to settle a privacy lawsuit centered on its voice assistant, Siri.
A preliminary settlement was filed Tuesday (Dec. 31) but must be approved by a U.S. district judge, Reuters reported Thursday (Jan. 2).
In the agreement, Apple denied wrongdoing, according to the report.
The proposed settlement also requires Apple to address the alleged privacy violations outlined in the lawsuit by confirming it has permanently deleted Siri audio recordings obtained before October 2019 and by publishing an explanation for users of how they can opt in to help improve Siri, The Wall Street Journal (WSJ) reported Thursday.
The plaintiffs alleged that when Siri was activated unintentionally, it shared the private discussions it overheard with Apple, and that Apple shared these communications with third parties without users’ consent, according to the WSJ report.
Two plaintiffs alleged that they received ads for products after mentioning them in private conversations; another said he got ads for a surgical treatment after discussing it with his doctor, per the Reuters report.
The period covered by the lawsuit runs from Sept. 17, 2014 — when Apple added the “Hey, Siri” feature that activates its voice assistant — to Dec. 31, 2024, according to Reuters.
The privacy lawsuit, originally filed in February 2021, was initially thrown out by a judge who found insufficient evidence of privacy violations. The plaintiffs resubmitted the suit, and in September 2021 the judge allowed it to proceed, ruling that the amended complaint made sufficient claims.
When considering the adoption of voice technology, consumers grapple with the balance between convenience and trust, according to the PYMNTS Intelligence report “How Consumers Want to Live in the Voice Economy.”
The report found that while consumers are comfortable using voice technology for simple, low-risk tasks like playing music, setting alarms and asking for directions, they have concerns about data errors and security breaches when tasks involve sensitive information, such as opening a bank account or disclosing personal and financial details while scheduling a doctor’s appointment.
These concerns have hindered the use of voice technology for more complex tasks, according to the report.