
Apple agrees to pay $95 million settlement for spying on users through Siri technology



Introduction

Apple has agreed to settle a class-action lawsuit alleging that its Siri voice assistant inadvertently captured conversations and potentially exposed users’ private information. The proposed settlement, reported by Bloomberg, could provide financial compensation to millions of US-based Apple product owners whose conversations were accidentally recorded by Siri. This article covers the details of the settlement, the allegations against Apple, and the broader implications for user privacy.

The Settlement

According to the proposed settlement, eligible claimants can receive up to $20 per device for up to five Siri-enabled devices, totaling a maximum payout of $100. However, individual payouts may be lower depending on the number of claimants. To qualify, users must have owned or purchased a Siri-enabled Apple product between September 17th, 2014, and December 31st, 2024, and swear under oath that they accidentally activated Siri during a conversation intended to be confidential or private.

Eligibility Requirements

To be eligible for the settlement, claimants must meet two key criteria:

  1. Product Eligibility: The device must be one of the following Siri-enabled Apple products:
    • iPhone
    • iPad
    • Apple Watch
    • MacBook
    • iMac
    • HomePod
    • iPod touch
    • Apple TV
  2. Accidental Activation: Claimants must swear under oath that they accidentally activated Siri during a conversation intended to be confidential or private.

The Allegations Against Apple

The initial class-action suit against Apple followed a 2019 report by The Guardian, which alleged that third-party contractors working on Siri quality control regularly heard confidential medical information, drug deals, and recordings of couples having sex. A whistleblower claimed that accidental triggers were common, citing examples such as the sound of a zipper triggering Siri.

Apple responded to the allegations by stating that only a small portion of Siri recordings were passed to contractors, and it later issued a formal apology and announced that it would no longer retain audio recordings.

Background on Siri Eavesdropping

While Siri is designed to be triggered by a deliberate wake word, accidental triggers have been reported in various instances. In some cases, users claimed that their iPhones had recorded them using Siri even when they hadn’t uttered the wake word. This has led to concerns about user privacy and the potential for confidential information to be exposed.

Similar Suits Against Google and Amazon

Apple is not alone in facing allegations of eavesdropping by voice assistants. Google and Amazon also use contractors to review recorded conversations, including accidentally captured ones, and have faced similar accusations. A comparable suit against Google is still pending, underscoring that user privacy is an industry-wide issue.

Conclusion

The proposed settlement marks a significant development in the ongoing saga of Siri eavesdropping. While the exact terms and payouts are still to be determined, the agreement could provide financial compensation to millions of US-based Apple product owners whose conversations were inadvertently captured by Siri. The allegations against Apple raise important questions about user privacy and the role of voice assistants in our daily lives.

Timeline

  • September 17th, 2014: Start of the settlement’s eligibility window for owning or purchasing a Siri-enabled Apple device
  • December 31st, 2024: Proposed settlement cutoff date for eligible devices

Related Articles

  • Google Defends Letting Human Workers Listen to Assistant Voice Conversations
  • Microsoft’s New Privacy Policy Admits Humans Are Listening to Some Skype and Cortana Recordings
  • Apple Apologizes for Siri Audio Recordings, Announces Privacy Changes Going Forward

Further Reading

For a deeper understanding of the issues surrounding Siri eavesdropping and user privacy, we recommend exploring the following resources:

  • The Guardian’s 2019 report on Siri quality control contractors
  • Apple’s formal apology and policy changes regarding audio recordings
  • A comprehensive guide to voice assistant security and user privacy

Note: This article is intended for informational purposes only. It does not provide legal advice or represent any claims against Apple or other companies involved in the suit.