In a class action lawsuit filed last week, healthcare workers alleged that their Amazon Alexa-enabled devices may have recorded their conversations, including potentially protected information, without their consent.


In a storyline that could be ripped right out of Dave Eggers' bestselling dystopian novel "The Circle", four plaintiffs raise concerns that Alexa may have captured sensitive information.


The healthcare workers, who include a substance abuse counselor and a healthcare customer service representative, say they work with HIPAA-protected information or have private conversations with patients. All say they either stopped using Alexa devices or purchased newer models with a mute function out of concern that their conversations might be unintentionally recorded, stored and listened to.


The lawsuit, which was filed in the Western District of Washington federal court, alleges "Amazon’s conduct in surreptitiously recording consumers has violated federal and state wiretapping, privacy, and consumer protection laws." 


THE COMPLAINTS

The plaintiffs have two major complaints: they say that users may unintentionally awaken Amazon Alexa-enabled devices, and that Amazon uses human intelligence and AI to listen to, interpret and evaluate these recordings for its own business purposes.


"Despite Alexa’s built-in listening and recording functionalities," argues the lawsuit, "Amazon failed to disclose that it makes, stores, analyses and uses recordings of these interactions at the time plaintiffs’ and putative class members’ purchased their Alexa devices,"


The lawsuit cites research finding that smart speakers can be activated by words other than their designated "wake words."

In a 2020 study from Northeastern University, researchers found that Amazon devices were activated by phrases including "I care about," "I messed up," "I got something," "head coach," "pickle" and "I'm sorry." Some of these activations were long enough to record potentially sensitive audio, the researchers reported.


In response to privacy concerns, in 2019 Amazon announced an ongoing effort to ensure that voice recording transcripts would be deleted from Alexa's servers when customers deleted them. In 2020, Amazon began allowing customers to opt out of human annotation of transcribed data and to automatically delete old voice recordings.


"By then, Amazon’s analysts may have already listened to the recordings before that ability was enabled," says the lawsuit.  


Amazon says it stores voice recordings to personalize and improve its software, and that users can delete their recordings. However, most people are unlikely to read the terms of service and are unaware that Amazon records them for these purposes.


From the lawsuit:

"Plaintiffs expected [the] Alexa Device to only 'listen' when prompted by the use of the 'wake word,' and did not expect that recordings would be intercepted, stored, or evaluated by Amazon. Had Plaintiffs known that Amazon permanently stored and listed [sic] to recordings made by its Alexa device, Plaintiffs would have either not purchased the Alexa Device or demanded to pay less." 


Previously, claimants had to enter arbitration as individuals, but as of June 2021, Amazon's terms of service allow customers to file class-action suits against the company in state or federal court. All cases must be filed in King County, Washington, where Amazon is based. Some 75,000 Alexa-related cases, likely adding up to tens of millions of dollars in filing fees payable by Amazon, have emerged in the past 16 months.


Source: Law.com

Photo: Amazon


